Intelligent Empathy: The Next Horizon of User Experience

Chris Lahiri
Jan 27, 2019 · 6 min read


“We appreciate your patience. A customer service representative will be with you momentarily.”

The above statement has, in some way, shape, or form, been ingrained into the memory of consumers over the last several decades. It’s the language of automated telephone and text bot support systems the world over. It’s certainly better than nothing, but after hearing it repeatedly, with no added utility such as a countdown of the remaining wait time, our senses register something similar to banner blindness on an empathetic level: We lose faith in its sincerity.

In the advertising world, Pepsi, Gillette, and Nike are all excellent case studies of failure and success in the broader context of empathy. Much has already been said about marketers and their recent attempts to show consumers they care about one moral perspective or another while simultaneously hawking their wares. The challenge of empathy and sincerity indeed remains a genie in a bottle for an industry that long ago mastered manipulating human perception quickly and with deadly accuracy. Taking advantage of both innate and learned reward-response mechanisms, this traditionally meant appealing to greed, lust, and gluttony; you know, the seven deadly sins, or some variation on them.

For product designers, the challenge of empathy also exists. But in the spirit of Valentine’s Day, let’s talk about love, not lust; need, not greed. Sleazy marketers be damned. We are at a nascent moment in the UX practice, in which we are still learning to recognize where empathy can be applied in the first place: We express it implicitly when we ease friction to help users accomplish various tasks, and we express it overtly when we communicate the system state. An opportunity exists here to craft a message that not only resonates with the user but might even yield greater engagement, immersion, and conversion. Handled the way it is popularly handled today, however, it has the potential to alienate the user beyond hope.

Returning to our hold message example, we realize that robotic attempts to state that a system “cares” about the user are ephemeral at best; even the most sincere intonation of a message can be trumped by constant repetition. Basically, any situation in which the user is awkwardly reminded that a machine is involved becomes the tipping point between genuine delight, tolerable machine feigning, and grotesque mechanical pandering to a recipient who has long since been desensitized.

As UX practitioners, we’re often the sole voice in the room advocating for the end user, and as such, we must be deeply in tune with human behavior. This means uncomfortably thinking about and being in touch with our own humanity, not just plugging in “best practices” surrounding user behavior. The latter is a formula for disaster. The culture of the last two decades has seen a decrease, rather than an increase, in the ability to empathize with others. Cultural disparities abound, often in violent manifestations. I believe this is partially because humanity has increasingly emulated machine behavior rather than vice versa. The pressure has been on since the Industrial Age to conform to primitive machines. Really, would you want a box of chocolate made at this factory? We’re increasingly aware of when systems and their products aren’t empathetic, if for no other reason than that we’re inundated with manufactured behavior surrounding manufactured products. Contemporary society is in a desperate search for authenticity, artisanship, and craftsmanship.

The risk remains that we merely echo current, rather than potential, behavioral trends when we follow language and UI patterns that have worked in the past. We have it within us to keenly detect what is sincere and insincere, genuine and disingenuous. That humanity is itself an analytical instrument, one we can apply to a system carefully and in a controlled fashion. Indeed, we have the power to craft and fine-tune the next century of human-computer interactions, and to define synthetic communication of the system state to the human user well beyond what we have today.

A practical way to begin doing this is to seek out those interaction points and apply a few techniques:

Utility and Variation: In the above example, the hold message is tolerable. However, any sincerity is stripped once the message is repeated several times. Repetition is often unavoidable; when it is, we can bring utility and variation to help the user understand what they’re facing, what their options are, and where to go next.

Hence we now have messages like this:

“We appreciate your patience. There are currently five people ahead of you. Your wait time is approximately eight minutes. If you do not wish to hold, press 1 to have a representative return your call.”

This is an example of bringing greater utility, something machines are infinitely better at than humans, as we well know. Can this be further improved upon? Surely it can, once we take into consideration the temporal element of wait time and repetition, and apply variation over the course of the experience.

In the above example, let’s assume the user has chosen to wait rather than be called back. He has decided that the wait time is acceptable and may well have put the phone on speaker until a representative picks up the line. The system, in order to avoid appearing insincere in its appreciation of the human’s wait time, could change its message by removing the repeated statement of appreciation, a subtle but less mechanically pandering behavior:

“There are currently five people ahead of you. Your wait time is approximately eight minutes. If you do not wish to hold, press 1 to have a representative return your call.”

This brings us to our second technique: weeding out disingenuous communication over the length of an experience. The system doesn’t appreciate the user’s patience. It never did, and the end user is well aware of it. The first time, it is tolerable to hear this myth; the second time, less so; the third time, “Shut up with your bullshit and get me a human being on the line!” In the mind of the writer of this message, the system perhaps represents the company when stating its appreciation of the user’s patience; in the mind of even the most remotely savvy user, however, they are interacting with a machine that can only grow more disingenuous with time.
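To make these two techniques concrete, here is a minimal sketch of a hold-message generator. The function name, parameters, and the wait-time arithmetic are all hypothetical rather than drawn from any real telephony system; the point is simply that the announcement leads with utility and drops the canned appreciation line after its first playback.

```python
def hold_message(position_in_queue: int,
                 avg_handle_minutes: float,
                 agents_available: int,
                 playback_count: int) -> str:
    """Build the announcement for one playback of the hold loop."""
    # Rough wait estimate: callers ahead, divided across available agents,
    # scaled by average handle time. Illustrative arithmetic only.
    est_minutes = max(1, round(position_in_queue * avg_handle_minutes
                               / max(agents_available, 1)))

    parts = []
    if playback_count == 0:
        # Say it once; repeating it is exactly what erodes its sincerity.
        parts.append("We appreciate your patience.")

    # Utility: tell the caller where they stand and what their options are.
    parts.append(f"There are currently {position_in_queue} people ahead of you.")
    parts.append(f"Your wait time is approximately {est_minutes} minutes.")
    parts.append("If you do not wish to hold, press 1 to have a "
                 "representative return your call.")
    return " ".join(parts)


# First playback includes the appreciation line; later playbacks drop it.
print(hold_message(position_in_queue=5, avg_handle_minutes=4.8,
                   agents_available=3, playback_count=0))
print(hold_message(position_in_queue=4, avg_handle_minutes=4.8,
                   agents_available=3, playback_count=1))
```

The design choice worth noting is that the variation is driven by the playback count rather than by randomness: the appreciation is voiced exactly once, and every playback after that is pure utility.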

The bottom line is this: Before entering the uncanny valley of communication, we might stop to consider a more sophisticated end user. As we fashion system state communication, we often choose the path of fooling the user into thinking the message is human, when in fact people are fine with it obviously being a machine, as long as utility is provided. And as stated, advanced utility is the very thing machines are far better at providing than people in the first place.

“Let people be people. Let machines remain machines.”

Too much emphasis on the “synthetic apparition” connecting with the end user is a hallmark of the last century of human-computer interaction. Let people be people. Let machines remain machines. There are far deeper reasons why we remain strangely distant from the goal of bridging the gap between man and his creepy machine counterpart. We have desensitized ourselves to the nuances of human behavior, become more machine-like and less empathetic, and dulled our emotional intelligence, leaving us to live and love in a world less nuanced than the one prior generations knew. It’s reflected in our music, our daily expression, and in many activities that could do with greater care and authenticity.

How often have you had to slow down your voice and remove vocal inflections to sound more robotic, just so a digital assistant like Alexa or Siri could understand what you want? In what other ways has machine behavior influenced human behavior rather than vice versa?
