Empathetic technology: can devices know what you’re feeling?

For some, the word “technology” might evoke cold imagery of steely robots and complex computer algorithms. But a talk on “empathetic technology” at this year’s Wired Health conference did a lot to change this perception.

Our smart devices may soon know how we are feeling even before we do. Medical News Today

By Ana Sandoiu, Medical News Today, 13 April 2019
Fact checked by Carolyn Robertson

With approximately 39 million people in the United States currently owning a smart speaker, technology that caters to our needs is increasingly ubiquitous, occupying ever more of our personal space.

But smart devices can do much more than merely play our favorite song or search the internet when we ask them to. Smart speakers may soon be able to diagnose us or tell how we are feeling.

At Wired Health — an annual conference highlighting the latest developments in health tech — neuroscientist and technologist Poppy Crum, PhD, gave a talk aptly titled “Technology that knows what you’re feeling.”

Treading a fine line between ominous and hopeful, the title made a powerful point: soon, consumer technology may know our mental and physical states before we do.

But how, exactly, can technology achieve this? How can we harness its potential to help us elucidate mental and physical conditions, and what role does empathy play in all of this?

These are some of the questions that Crum answered at Wired Health — an event which this year took place at the Francis Crick Institute in London, United Kingdom.

Poppy Crum: Devices Will Know More About Our Well-Being Than Doctors. Forbes
What is empathetic technology?

Crum, who is the chief scientist at Dolby Laboratories in San Francisco, CA, and an adjunct professor at Stanford University’s Center for Computer Research in Music and Acoustics, defines empathetic technology as “technology that is using our internal state to decide how it will respond and make decisions.”

So how can technology read our internal states? Crum’s talk at Wired Health featured some interesting examples of neurophysiological “giveaways” that the right type of technology can now pick up easily — a phenomenon the scientist referred to as “the end of the poker face.”

For instance, as Crum showed in her talk, when we’re feeling overwhelmed by a cognitive load — or, in simpler terms, when we’re struggling to understand something — our pupils dilate.

Pupillometry research from the last few decades has shown that we can track multiple cognitive processes, such as memory, attention, and mental load, by observing the behavior of our pupils and measuring their diameter.

In fact, this is an experiment we can all “try at home.” In 1973, renowned psychologist Daniel Kahneman wrote:

“Face a mirror, look at your eyes and invent a mathematical problem, such as 81 times 17. Try to solve the problem and watch your pupil at the same time, a rather difficult exercise in divided attention. After a few attempts, almost everyone is able to observe the pupillary dilation that accompanies mental effort.”
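
To make the idea concrete, here is a minimal sketch of how a task-evoked pupil response could be quantified, comparing resting pupil diameters with diameters recorded while solving Kahneman’s “81 times 17” problem. All of the numbers and the 5% threshold below are invented for illustration; real pupillometry uses eye trackers and validated analysis pipelines.

```python
# A minimal sketch of task-evoked pupillometry, using made-up numbers.
# Real studies sample pupil diameter with an eye tracker at 60-1,000 Hz;
# the values and the threshold here are purely illustrative.
import statistics

# Hypothetical pupil diameters in millimetres, sampled once per second
baseline = [3.1, 3.0, 3.2, 3.1, 3.0, 3.1]        # resting, eyes on a fixation point
during_task = [3.3, 3.5, 3.6, 3.7, 3.6, 3.5]     # while mentally computing 81 x 17

baseline_mean = statistics.mean(baseline)
task_mean = statistics.mean(during_task)
dilation_mm = task_mean - baseline_mean
dilation_pct = 100 * dilation_mm / baseline_mean

print(f"Baseline: {baseline_mean:.2f} mm, task: {task_mean:.2f} mm")
print(f"Task-evoked dilation: {dilation_mm:.2f} mm ({dilation_pct:.1f}%)")

# Crude flag for elevated cognitive load: dilation above an arbitrary 5% threshold
if dilation_pct > 5:
    print("Pupil response consistent with increased mental effort")
```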

Further experiments have shown how skin conductance, also known as galvanic skin response, can be used to predict a person’s emotional response when watching a movie or a football match.

How much sweat a person’s skin secretes, and the resulting changes in the skin’s electrical resistance, can predict “…stress, excitement, engagement, frustration, and anger.”
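
To illustrate the kind of processing involved, the sketch below separates a slowly drifting “tonic” skin-conductance level from sudden “phasic” responses in a synthetic signal and flags the moment a response appears. The simulated data, the smoothing window, and the 0.2-microsiemens threshold are all assumptions made for the example; research and commercial systems use dedicated sensors and validated scoring methods.

```python
# A minimal sketch of detecting phasic skin-conductance responses in a
# synthetic galvanic skin response trace. All values are made up.
import numpy as np

rng = np.random.default_rng(0)
fs = 4                                    # samples per second, typical of wearable GSR sensors
t = np.arange(0, 60, 1 / fs)              # one minute of data
tonic = 5 + 0.01 * t                      # slowly drifting baseline conductance (microsiemens)
phasic = np.zeros_like(t)
after = t > 30                            # a burst of sweat-gland activity after a tense moment
phasic[after] = 0.8 * np.exp(-(t[after] - 30) / 5)
signal = tonic + phasic + rng.normal(0, 0.02, t.size)

# Estimate the slow tonic level with a moving average, then look at what is left over
window = 8 * fs                           # 8-second smoothing window
kernel = np.ones(window) / window
tonic_estimate = np.convolve(signal, kernel, mode="same")
residual = signal - tonic_estimate

# Ignore the edges, where the moving average is unreliable, and flag samples
# whose phasic component exceeds an arbitrary 0.2-microsiemens threshold
valid = slice(window, t.size - window)
hits = t[valid][residual[valid] > 0.2]
if hits.size:
    print(f"Possible emotional response around t = {hits[0]:.1f} s")
```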

Furthermore, humans exhale chemicals, such as carbon dioxide and isoprene, when they feel lonely or scared. In fact, in the TED talk below, Crum tracked the carbon dioxide that members of the audience exhaled as they watched suspenseful scenes from a thriller movie.

Technology that knows what you’re feeling | Poppy Crum. What happens when technology knows more about us than we do? Poppy Crum studies how we express emotions — and she suggests the end of the poker face is near, as new tech makes it easy to see the signals that give away how we’re feeling.
In a talk and demo, she shows how “empathetic technology” can read physical signals like body temperature and the chemical composition of our breath to inform on our emotional state. For better or for worse. “If we recognize the power of becoming technological empaths, we get this opportunity where technology can help us bridge the emotional and cognitive divide,” Crum says. TED. YouTube, Jul 10, 2018

Although scientists have known about these processes for a while, Crum noted in her Wired Health talk, the devices that researchers now use in their labs to detect these changes are 10 times cheaper than they were decades ago. Smart glasses can now detect such changes, as can cameras from a considerable distance.

Practical applications of empathetic tech

“Empathetic” hearing aids could be personalized and attuned to the amount of effort that a person with hearing problems must expend to make out what someone is saying, said Crum in her Wired Health talk.

This would help destigmatize those living with certain disabilities, as well as provide them with optimal care.

Empathetic technology also has wide implications for our mental wellbeing. “With more capable cameras, microphones, thermal imaging, and exhalant measuring devices, we can capture prolific data,” writes Crum, data that can, in turn, function to alert carers.

On the subject of mental health, it is not only the eyes that offer a window into someone’s “soul,” but also the voice, Crum expounded in her Wired Health talk.

Researchers have applied artificial intelligence (AI) to data they gathered on parameters such as syntactic patterns, pitch-reflex, and use of pronouns to accurately detect the onset of depression, schizophrenia, or Alzheimer’s disease.

For example, less than a year ago, Tuka Alhanai, a researcher at the Computer Science and Artificial Intelligence Laboratory (CSAIL) at the Massachusetts Institute of Technology in Cambridge, MA, led a team of scientists who designed a neural network model that accurately predicted depression by analyzing speech patterns from 142 participants.

“The model sees sequences of words or speaking style, and determines that these patterns are more likely to be seen in people who are depressed or not depressed… Then, if it sees the same sequences in new subjects, it can predict if they’re depressed too.” – Tuka Alhanai

Study co-author James Glass, a senior research scientist in CSAIL, also commented on the findings at the time. “Every patient will talk differently,” he said, “and if the model sees changes, maybe it will be a flag to the doctors… This is a step forward in seeing if we can do something assistive to help clinicians.”
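
The CSAIL model itself was a neural network trained on sequences of audio and text from clinical interviews. As a deliberately simplified stand-in for that approach (not the team’s actual method), the sketch below fits a bag-of-words classifier to a handful of invented transcripts, just to show the general idea of learning language patterns that tend to co-occur with a label.

```python
# A toy text classifier as a simplified stand-in for speech-based screening.
# The transcripts and labels below are invented purely for illustration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

transcripts = [
    "i feel tired all the time and nothing really interests me anymore",
    "i have been sleeping badly and i just feel empty most days",
    "work has been busy but i am enjoying the new project a lot",
    "we went hiking at the weekend and it was a great trip",
]
labels = [1, 1, 0, 0]   # 1 = depressed, 0 = not depressed (toy labels)

# Turn each transcript into word and word-pair frequencies, then fit a linear model
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(transcripts, labels)

new_transcript = ["lately i feel exhausted and i do not enjoy anything"]
print(model.predict(new_transcript))          # predicted label for the new text
print(model.predict_proba(new_transcript))    # predicted class probabilities
```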

Other researchers have used computer algorithms to study half a million Facebook status updates to detect “depression-associated language markers,” such as emotive cues or greater use of first-person pronouns, like “I” or “me.”
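
One of those markers, the rate of first-person singular pronouns, is simple enough to compute directly. The sketch below counts such pronouns in a single made-up post; the sample text is invented, and the published research combined many markers across hundreds of thousands of posts rather than relying on any single measure.

```python
# A minimal sketch of one "depression-associated language marker": the rate of
# first-person singular pronouns in a piece of text. The post is invented.
import re

FIRST_PERSON = {"i", "me", "my", "mine", "myself"}

def first_person_rate(text: str) -> float:
    """Fraction of words that are first-person singular pronouns."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    return sum(w in FIRST_PERSON for w in words) / len(words)

post = "I keep thinking about how tired I am and how little I get done."
print(f"First-person pronoun rate: {first_person_rate(post):.2f}")
```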

Arthritis gloves and inclusive design

Of course, empathetic technology can enhance not only our understanding of psychological conditions but also that of physical ones.

An experiment that Crum and her team conducted used arthritis simulation gloves to create an empathetic experience for a group of participants. The researchers then asked these participants to design the menu of an app, bearing in mind that its users would have arthritis.

The participants in the arthritis simulation group designed a wholly different user experience from those in the group who could not empathize with their users. People in the former group eliminated features like drop-down menus, for instance, which are hard to use for people with limited finger mobility.

The gloves were the result of 10 years of research into “inclusive design,” an effort led by John Clarkson, a professor of engineering design at the University of Cambridge, UK, and Roger Coleman, a professor emeritus of inclusive design at London’s Royal College of Art.

In the video below, Sam Waller — a researcher with the Inclusive Design Group at the Cambridge Engineering Design Centre — uses the arthritis gloves to show how difficult an action as simple as opening a pack of Post-its can be for those living with the condition.

Bridging The Exclusion Gap. A set of gloves and glasses which simulate common physical limitations, like age-related long-sightedness or arthritis, have been released in the hope of getting more designers to think again about the usability of their products.
Researchers at the University of Cambridge’s Engineering Design Centre say that millions of people around the country — in particular the ageing, baby-boomer generation — have unnecessary difficulty using everyday products ranging from gadgets, to packaging, to windows and doors, because of poor design. Addressing these issues would also reduce the costs of social care. Cambridge University. YouTube, Jun 26, 2013

Waller also uses a pair of glasses to simulate vision problems, and other researchers have used immersive technology, such as virtual reality simulators, to recreate the experience of living with “age-related macular degeneration, glaucoma, protanopia, and diabetic retinopathy.”

Towards an ‘era of the empath’

We are moving towards “the era of the empath,” as Poppy Crum has dubbed it — an era where “technology will know more about us than we do,” but also an era where we will know more about each other than ever before.

“Consumer technology will know more about our mental and physical wellness than many clinical visits.” – Poppy Crum

Combining machine learning with sensing technology and the vast amounts of data it can gather offers great opportunities for physicians, writes the scientist. “Here are just a few other examples of how this might play out,” she notes.

“By combining drug regimens with empathetic technology, doctors gain a closed feedback loop of data from the patient, changing drugs and therapies based on your signals.”

“Or, weeks before you go in for knee surgery, your orthopedic surgeon can gather much more data about your gait and how you use your knees in ways that may benefit from different considerations during your physical therapy rehabilitation post-surgery,” she continues.

At Wired Health, Crum seemed to have convinced her audience that empathetic technology, coupled with AI, can drastically improve our lives rather than hinder them — a point the scientist drives home in many of her previous articles.

“AI is often feared because people think it will replace who we are. With empathetic technology, AI can make us better, not replace us. It can also assure us and our doctors that the interventions they prescribe are actually solving the problems we have.” – Poppy Crum

Source: Medical News Today

References

Pupil dilation as an index of effort in cognitive control tasks: A review, van der Wel P, van Steenbergen H. Psychon Bull Rev. 2018 Dec;25(6):2005-2015. doi: 10.3758/s13423-018-1432-y. Full text

Further reading

Empathy is hard work: People choose to avoid empathy because of its cognitive costs, Cameron CD, Hutcherson CA, Ferguson AM, Scheffer JA, Hadjiandreou E, Inzlicht M. J Exp Psychol Gen. 2019 Jun;148(6):962-976. doi: 10.1037/xge0000595. Epub 2019 Apr 18.

Also see
The empathy option: Digging into the science of how and why we choose to be empathetic Medical Xpress
AI That Understands Your Body Language Forbes
Making Electronic Medical Records More Personal Wisconsin Public Radio
The Future Is About Empathy, Not Coding The Medical Futurist

 
