
For some time now, artificial intelligence (AI) has been seen as a tool that can use dozens of data points to enhance customer service and experience. But its true potential has yet to be revealed — and the same is true for its vulnerabilities.

Regardless of the unknowns, things are quickly reaching a tipping point. Thanks to a number of factors, businesses are running out of time to make a decision about their use of AI.

  • Not using AI will be a crippling mistake. According to a Forbes Insights survey, a large majority of corporate leaders (86%) agree that businesses will face a “significant disadvantage” in the next five to 10 years if they don’t incorporate AI.
  • Consumers want personalization. In one study, 80% of consumers reported that they like it when a brand’s offerings are personalized according to their needs and interests. And 70% said they like it when offerings are based on their previous interactions.
  • Personalization requires data that can only be delivered via AI. For organizations to improve the user experience, they need to acquire and analyze consumers’ shopping preferences based on their data trail (big data).
  • Consumers are unaware of how their data is collected and used. The percentage of consumers who completely read and understand privacy and disclosure agreements is lower now than in years past.

Consumers are becoming increasingly concerned about both privacy and security: privacy, when they read articles like this one explaining how companies use online activity to predict future behavior; security, as data breaches continue to make headlines. And they’re right to be concerned: More than 105 million Americans experienced cybercrime over the past year, an average of about three victims every second.

When you consider these factors, it becomes clear that businesses really have only one choice: Continue to collect and use data, but do so ethically, with the consumers’ best interests in mind. The businesses that win over the next decade will be those that embrace “ethics by design,” meaning ethics is at the heart of what they do with data, not just an item on a checklist.

Related Article: How to Overcome Resistance to Digital Policies and Standards

What Do 'Ethical by Design' Data Policies Look Like?

No one expects businesses to use data in ways that aren’t profitable, but they do expect businesses to use data in ways that provide value to consumers while not causing harm. Accomplishing that goal isn’t easy. It takes commitment and constant self-monitoring, but it can be done.

Digital policies, therefore, have to consider the nature of the data, its role in achieving the business’s goals, and the steps the organization must take to safeguard that data. For example, organizations should prioritize their databases based on the value each brings to the business, as well as the risks associated with collecting and holding the data (e.g., exposure in a data breach).
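To make that prioritization concrete, it can be sketched as a simple value-versus-risk triage. The dataset names and 1-to-5 scores below are purely hypothetical, for illustration only:

```python
# Hypothetical data inventory: each dataset is scored 1-5 for the value
# it brings to the business and 1-5 for the risk if it were breached.
datasets = [
    {"name": "purchase_history",  "value": 5, "breach_risk": 3},
    {"name": "email_addresses",   "value": 4, "breach_risk": 4},
    {"name": "browsing_trackers", "value": 2, "breach_risk": 5},
]

def triage(inventory):
    """Sort datasets so the riskiest, least valuable ones surface first:
    prime candidates for deletion or for tighter safeguards."""
    return sorted(inventory,
                  key=lambda d: d["breach_risk"] - d["value"],
                  reverse=True)

for d in triage(datasets):
    print(d["name"], d["breach_risk"] - d["value"])
```

In this sketch, a tracker database with high breach risk and low business value rises to the top of the list, flagging it as the first thing to reconsider collecting at all.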

Related Article: Marketers, Data Collection and the E-Word: Ethics

Foundations of Ethical Data Use

1. Collect — and Use — Only as Much Data as You Need to Offer Value to the Consumer

If you sell high-end furnishings, you don’t need to store a consumer’s search for “divorce lawyer” or their Pandora playlists. Just because cookies and trackers can give you that information doesn’t mean you should take it. And not using it isn’t enough: if you store it, it’s vulnerable — and so is the privacy of the person it belongs to.

2. Use Only Data You Get Directly from the Consumer

Not too long ago, the data organizations used to drive business decisions came directly from the consumers themselves — when they filled out a form for a newsletter, used their email address to receive a receipt for online purchases, signed up for a loyalty program, etc. Now, AI can stitch together data from dozens of actions, creating a more complete profile than customers realize. When that happens, personalization starts to feel creepy.

On the other hand, the correct and appropriate use of data can drastically improve results. That’s why using consumer data with integrity makes good business sense.

Businesses may struggle to find that line, but consumers know it when they see it:

  • Being greeted by name when you walk into a brick-and-mortar store.
  • Receiving offers based on your purchase history at another brand.
  • Having your children receive text or phone calls from a brand you’ve dealt with.

The result of any of these actions? Consumers won’t trust you anymore. And once you lose their trust, your competitors will be waiting in the wings, ready to earn it (along with their money) in a transparent exchange of data for value. 

Related Article: Marketers Are Missing the Point of the GDPR – and the Opportunity

3. Do No Harm

Every piece of data you collect, analyze and store carries some degree of risk for both your organization and your customers. An important part of an “ethical by design” approach is to ask yourself, “What could go wrong?” Here are some examples:

  • A breach could expose sensitive customer data. This encompasses two points: First, don’t store sensitive data if you don’t need it. And, if you do need it, follow proper security protocols like anonymization or pseudonymization.

  • Your AI is built on unintentional bias. If you’re training your AI to detect emotion in human faces (up to 95% of purchasing decisions are influenced by emotions), and the faces that make up your training data don’t span a wide enough range of ethnicities, the conclusions it draws will be flawed. And those flawed conclusions will only compound over time.

    Avoiding bias is even more important when artificial intelligence drives decisions, such as loan or employment applications. But it can just as easily derail a marketing campaign by making false assumptions about consumer wants and needs.

    The best solution? Diverse development and marketing teams will help you avoid the pitfall of “garbage in, garbage out.”

  • You place profit over the social contract you have with consumers. When you give revenue top priority, anything can be justified. You can convince yourself that moving the ethical line by a hair or two won’t really make a difference. Move it a few more hairs, however, and customers will revolt. They’ll realize you’ve betrayed their trust. They’ll take their business elsewhere, and you won’t get them back. Trust plays a big role in business, which is why there’s such a strong business case for committing to the ethical use of data.
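On the security-protocols point above, pseudonymization can be as simple as replacing direct identifiers with keyed hashes before data reaches analytics systems. A minimal sketch in Python, assuming the secret key is held outside the database (the key value here is only a placeholder):

```python
import hashlib
import hmac

# Placeholder secret; in practice, keep this in a secrets manager,
# never stored alongside the pseudonymized data.
PEPPER = b"replace-with-a-securely-stored-secret"

def pseudonymize(value: str) -> str:
    """Replace a direct identifier (e.g., an email address) with a
    stable token via HMAC-SHA256. The same input always yields the
    same token, so records can still be joined for analysis, but the
    original value can't be recovered without the secret key."""
    return hmac.new(PEPPER, value.encode("utf-8"), hashlib.sha256).hexdigest()

# Two records for the same customer map to the same token.
token_a = pseudonymize("jane.doe@example.com")
token_b = pseudonymize("jane.doe@example.com")
assert token_a == token_b
```

A keyed hash is used rather than a plain SHA-256 because identifiers like email addresses have low entropy: without a secret key, an attacker could recover them by hashing guesses and comparing.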

The Future Belongs to the Ethical

It’s not that consumers don’t know you’re collecting and using their data. They do. And as long as they feel they’re getting a good deal — be it personalized offerings, special coupons, custom entertainment, etc. — they’ll stick around. But the minute they feel like they’re getting ripped off — that you’re not protecting their data, or that you’re using it in a way that benefits you a lot more than it benefits them — they’ll disappear. Ethical use of data assumes an equal exchange of value. And, with so many brands to choose from, the businesses that sincerely put ethics at the heart of their business processes will reap the benefits.