Today, more users are comfortable with companies using relevant personal information transparently and beneficially. In the Salesforce State of the Connected Customer study, 61% of survey respondents said they were comfortable with these personalization practices, up from 52% the year prior. However, there’s no universal definition of what counts as transparent and beneficial.
Design leaders are often the ones who end up determining what an ideal user experience looks like – and what it doesn’t look like. Let’s consider a few examples for context.
- An account executive on a customer call sees a chatbot pop up with advice to change their tone to be more effective. The suggestion can feel intrusive, like coaching no one asked for.
- A consumer searches for, say, wedding ring jewelers on their laptop web browser. They later see an AI-targeted ad for event planners on their mobile social-media feed.
Many people describe experiences like these as creepy. AI can cross a line when it uses information or behavior gathered from a different location, experience, or device without a clear explanation or consent process. That line, however, can blur depending on the user and their background.
“Users need to feel that there’s a mutually beneficial relationship between the company and themselves as they exchange their data for personalized services,” said Daniel Lim, Salesforce Futures Senior Director, who also worked on the Ethics by Design report with the World Economic Forum (WEF).
Reciprocate for personal data
Lim reminds designers that this value exchange is still new. Today, customers get more personalized user experiences while companies gather data and gain insights. Yet not long ago, customers simply paid for goods and services without giving up personal data.
Information is now everywhere. By 2028, 221 zettabytes of data will exist worldwide, according to the International Data Corporation (IDC). There are also powerful tools on the market, such as our new Salesforce Data Cloud, to unlock customer data and surface actionable insights.
Knowing this, Lim sees mutually beneficial design as an emerging design principle for building a future with more trust, and he stresses that it cannot be overlooked. “The internet changed trust forever,” he said.
Observe the ethical environment
He spoke about this on the podcast series Voicing Relationships, produced in partnership with House of Beautiful Business, the network for a life-centered economy. Beyond being specific to each user, the line between creepy and cool is also specific to society.
“What is good in society is completely contextual,” said Lim. “What’s ethical in a society is completely contextual, as well.”
Similarly, human-centered design offers a lens to dig deeper into what’s important to users. Instead of making assumptions, having conversations can reveal insights that inform what personalized user experiences should look like. “The future of personalization really depends on the use of technology to sense and respond to the context of each individual – understanding what their needs are and responding without explicit instruction,” said Lim.
Designers are increasingly aware of how to avoid deceptive designs. Now, in this data- and AI-driven era, it’s also critical to avoid designs that users might describe as creepy.
Prevent user discomfort
Incorporating ethics into design processes is a topic Emily Witt champions in her role as Salesforce Director of Research & Insights. “Our customers trust us to do the hard work of creating trustworthy experiences. We need to make sure we avoid creating experiences that feel invasive, manipulative, or anxiety-provoking.”
She’s found that explanations can go a long way to increasing comfort. Using language that puts users at ease might look like sharing why you’re requesting private information or what information you’re using to generate an output. It also helps to clearly communicate when companies employ AI — a transparent disclosure outlined in the Salesforce Acceptable Use Policy.
For example, when naming Tracey Bot, the contact-tracing Einstein Bot, the team intentionally included “Bot” to make clear to users that an AI, not a person, was contacting them about their health. With personal medical information, there’s a risk of falling into what’s known as the uncanny valley. To avoid user discomfort, the team also built in guidance for organizations to send a message only when an agent is available to handle inbound calls. This example shows how human-centered design helps create personalized user experiences with data and AI that meet user expectations, and can even spark positive sentiment.
“Don’t be creepy” is a common refrain in the tech industry. Customer companies can honor it by considering how users perceive the data-for-personalization value exchange, which calls for responsible marketing personalization and responsible AI design principles. That’s how to pave the way toward greater customer trust and loyalty, and it all starts with designing solutions that keep these emerging ideas top of mind.