Wearing Your Heart on Your Interface

By Pamela Pavliscak

Imagine you have a tight work deadline. You grimace and sigh. It’s late but you still have several hours of work ahead.

When you open your calendar to schedule a meeting for 9am tomorrow, it suggests alternate times, knowing that you are likely not to be in the best state of mind. Your smartphone notifications are automatically turned off; even the ads for sleep meds are temporarily disabled.

As you walk to the kitchen to get a snack, the lighting remains dimmed so that you aren’t jolted awake. The smart fridge spotlights the milk and fruit rather than the leftover cake.

It’s the Internet of Emotionally Intelligent Things.

The excitement around emotion-sensing technology is palpable, which is to say, probably detectable. The possibilities are tantalizing. More than the intentional act of ❤ing a favorite post on the newly emotive Twitter or, in my case, near-compulsive wowing on Facebook, technology may start to provide new revelations about unguarded emotional reactions.

Facial expressions can now be reliably mapped to a few core emotions. Tone of voice can be decoded for emotional timbre. Blood pressure, heart rate, and skin temperature can be translated into mood. Even humble textual analysis is moving beyond the happy face and sad face of social listening past.
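To make the idea concrete, here is a minimal sketch, in Python, of how a device might translate a few raw readings into a coarse mood label. The function name, the signals it looks at, and every threshold are invented for illustration; no shipping product works this simply.

```python
# Illustrative only: a toy mapping from raw physiological signals to a coarse
# mood label. The thresholds are made up for the example, not clinically grounded.

def estimate_mood(heart_rate_bpm: float, skin_temp_c: float, hours_awake: float) -> str:
    """Return a rough mood label from a handful of sensor readings."""
    if hours_awake > 18:
        return "exhausted"   # a long day: nudge toward rest, not another 9am meeting
    if heart_rate_bpm > 100 and skin_temp_c < 33.0:
        return "stressed"    # elevated heart rate plus cool skin reads as tension
    if heart_rate_bpm < 70:
        return "calm"
    return "neutral"


if __name__ == "__main__":
    # A late-night, deadline-driven reading like the scenario above.
    print(estimate_mood(heart_rate_bpm=105, skin_temp_c=32.5, hours_awake=19))
```

The point is less the mechanics than the design question that follows: once a calendar or a fridge has even a crude label like this, what should it do with it?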

Our devices already track all kinds of data points and try to assemble those into a holistic identity for us. We are starting to get comfortable designing for that hybrid algorithmically-defined person, that extended self that lives through technology. But it’s a little off. Perhaps, adding an emotional layer will help us make the leap across the uncanny valley to something more genuine.

This Emotional Digital Life

The human-machine relationship is not purely rational; it's emotional. People tend to treat computers as if they are real people. If you’ve ever talked back to your GPS or said goodnight to Siri, you know this feeling. You might think of your Roomba as a pet, or give a name to your laptop.

A few human elements, barely present, are all that’s needed to establish an emotional connection. It may be intentional, like the Macintosh Color Classic's baby face, with the large inset monitor as eyes and the disk drive as a mouth. Experiences that aren’t embodied can be emotional too. Conversations with chatbots can feel like a real dialogue, no matter that the personality is an amalgam of data points, a trans-global design team's best intentions, poets, and maybe Turkers.

Koko UX design interface

Koko, an emotional well-being app, is now being used to train chatbots for empathy.

Sometimes the feeling is not so explicit. Maybe we don’t assign a personality to our favorite app, but we certainly know how it makes us feel. With smartphones, it goes a step further. We aren’t sure if the phone is a best friend, a saboteur, or an extension of ourselves. The emotional undercurrent to our relationship with technology is real, whether we admit it or not. So shouldn’t we design for that?

Design, with Feeling

In other creative disciplines, emotion is an explicit goal. Branding consultancies are very much in touch with the feelings they want to embody; advertising agencies are all about the feelings an ad will evoke. The emotional experience people have with technology often begins with advertising campaigns, by putting 1,000 songs in your pocket, for instance.

In the tech fields, we do design for emotion even if we aren’t always explicit about it. Engagement has become code for emotional design. It typically means finding a pain point and then creating an experience that addresses it. Not permanently, because the cycle has to be repeatable.

That’s how engagement is measured, after all. Without fully acknowledging it, we design for negative emotions like boredom or anxiety or FOMO. Design kills the pain temporarily, and then, if it’s successful, repeatedly. What seems to be happening, though, is that designing to alleviate pain doesn’t alleviate pain. Instead it cultivates a cycle of negative emotion. The good feeling doesn’t last. The false sense of urgency creates a new source of stress. And the good feeling might not even be that good, after all. Plenty of academic research power has been devoted to deciphering the fear of being left out, the envy of social comparison, and the disappointment of surface-skimming conversation.

The inner narrative people have about technology is one of addiction. It’s the topic of countless articles, reinforcing the technology-as-pathology mental model. It’s the subject of over 100 TEDx talks admonishing you to look up from your phone at least once in a while to stave off the slow decline of your humanity. From the tech industry, it’s a shrug and maybe an app to control the urge to open the other app.

Sweet Micro-Somethings

The pursuit of engagement leads us to design for emotion implicitly, without labelling it emotional design. Emotional design, instead, has become synonymous with delight. We explicitly design for delight. By explicitly, I mean intentionally but also visibly.

While we could design for emotional support, or for the symbolic meaning that accrues over time, it’s not clear exactly how to do that. What we can do is create pops of joy in images or text. Over the past decade, delight has taken on its own well-documented character. It’s friendly, a little cheeky, and usually illustrated and embodied in a quirky cartoon character like the iconic Freddie of Mailchimp.

Delight has its limitations though. It’s not delightful to make light of something serious. Whimsical details can seem infantilizing rather than fun. Clever can come at the cost of clarity. Delight is fleeting, idiosyncratic, and personal. As much as we try to design for delight, it’s not really up to us. People create the meaning.

A New Emotionally Intelligent Design

Technologies that are truly essential to our well-being are technologies we engage with emotionally. While a digital assistant can tell you the weather or an app can remember music you like or a cute error message can make you smile, they don’t react to changes in your mood.

Right now, apps and chatbots don’t know whether you are having a good day or a bad day, whether you are running up against a big deadline or are about to leave for a long weekend, whether you spent the last hour laughing with close friends or arguing with your spouse. They might be engaging or even delightful, but they aren’t empathetic.

Screen shot of the Crystal interface, showing a personality profile built from public data

Crystal gives a hint of the next wave of emotionally intelligent apps, creating a personality profile from public data that supports communication.

Will emotion-sensing technology lead to emotionally intelligent design? Probably not at first. Feelings are complicated, and hopelessly entangled with identity and experience and context. Not to mention that emotions are mixed up with sensory perception and cognition and behavior. Recognizing five canonical facial expressions, even if people did start to convey more emotion when gazing at their phone or looking at their refrigerator, is not going to magically make us more sensitive. But emotion-sensing technology may inspire us to reconsider emotional design.

---

Pamela Pavliscak is a keynote speaker at Interact London 2016 - join Pamela and 200 of the best in UX on the 18th and 19th of October. Grab your ticket today >>
