
Emotional AI: Driving deeper connections for patients in digital healthcare

By Sarita

on September 10, 2022

At PDD we like to keep one eye on the horizon, scanning themes and emerging trends across the healthcare sector to understand what’s next. As the rise of digital health and therapeutics continues, we take a look at what this means for the patient experience and what opportunities lie in the area of self-care and self-management.

In recent years, there has been a significant growth in investment in healthcare-driven AI, in response to the accelerated demands for remote healthcare systems. We are seeing an increase in development of monitoring apps, digital triage systems and home-care devices aimed at providing healthcare support to patients and shifting some aspects of monitoring and treatment away from clinical settings.

Software, in the form of mobile health (mHealth) apps, is increasingly being prescribed by clinicians as a supportive counterpart to drugs, and in a few cases as an alternative to them. Yet, while these apps present many opportunities to enhance the patient experience with real-time feedback and more healthcare touchpoints from the comfort and convenience of home, significant challenges around high patient dropout rates remain.

To address this challenge of retention, mHealth apps have gone some way towards increasing patient engagement with push notifications, elements of gamification and connections to support groups or healthcare professionals. However, with even more emphasis now on digital healthcare systems to aid self-care and self-management, we ask: How might we facilitate a deeper level of connection between patients and digital healthcare systems, to encourage better adherence and lasting, positive behavioural change?

Image ref: Young person holding a hologram projection displaying health-related graphs and symbols.

Emotion and Function

A starting point, perhaps, would be to consider the patient in a holistic manner, placing their needs at the centre of solutions to create adaptive approaches to healthcare that take into account not just the physiological aspects of a condition, but also the psychological effects it can bring… fear, uncertainty, doubt, anxiety, apprehension, desperation, relief, hope.

While digital health solutions undoubtedly need to focus on functionality, be straightforward to use, and instil confidence in the patient, they can often lack a ‘human’ quality, which is where real connections happen. The emotional burden of health conditions heightens the need for empathy, understanding, patience, pace and contextual awareness, and demands a responsiveness that is personal to an individual’s situation.

Emotional AI is a branch of Artificial Intelligence (AI) that can recognise, interpret, and respond to the more emotive channels of human communication, driving more natural and authentic Human-Machine Interactions (HMI). It uses AI to detect both verbal and non-verbal signals, picking up on anything from voice inflections to facial expressions, essentially creating a series of behavioural biomarkers.
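To make the idea of behavioural biomarkers a little more concrete, below is a minimal Python sketch of how per-channel readings might be fused into a single estimate. It is illustrative only: in a real system the channel scores would come from trained vision and speech models, and the BehaviouralBiomarkers fields, weights and estimate_distress function are our own assumptions rather than any particular product’s method.

```python
from dataclasses import dataclass

# Illustrative only: in a real system these scores would come from trained
# vision and speech models; here they are hand-supplied placeholders.

@dataclass
class BehaviouralBiomarkers:
    facial_valence: float   # -1.0 (negative expression) .. 1.0 (positive)
    voice_arousal: float    #  0.0 (flat delivery) .. 1.0 (highly agitated)
    speech_rate_wpm: float  # words per minute from a speech-to-text pass

def estimate_distress(b: BehaviouralBiomarkers) -> float:
    """Fuse channels into a single 0..1 distress estimate (toy weighting)."""
    valence_term = max(0.0, -b.facial_valence)                # only negative affect counts
    rate_term = min(1.0, abs(b.speech_rate_wpm - 150) / 150)  # deviation from ~150 wpm baseline
    return round(0.5 * valence_term + 0.3 * b.voice_arousal + 0.2 * rate_term, 2)

reading = BehaviouralBiomarkers(facial_valence=-0.6, voice_arousal=0.7, speech_rate_wpm=210)
print(estimate_distress(reading))  # -> 0.59
```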

The use of Emotional AI is starting to emerge across a wide range of industry sectors, from smart home systems to automotive, digital marketing to retail, and even in the financial sector. The aim of all these applications is to create a deeper connection with users through systems that can anticipate needs, and automatically adapt their outputs based on even the most subtle of human behavioural cues, without the need for excessive data input by the user.

Here we look at the most relevant examples to explore the opportunities they might bring for future healthcare applications.

The evolving personality of AI

Image ref: Olly home robot with personality, Emotech

AI with a personality that evolves over time is an intriguing concept, and that’s exactly what London-based robotics startup Emotech explored through their voice-controlled home assistant concept, Olly. To differentiate it from similar systems such as Alexa, Siri and Google Home, Emotech proposed using machine learning algorithms that would ‘teach the system to gradually be more like its owner’, allowing the user to train Olly.

The concept could detect and interpret facial expressions, voice inflections and verbal patterns, enabling it to start conversations with the user and make suggestions based on what it had ‘observed’. Other key elements were an understanding of scheduling patterns and contextual awareness, allowing the system to anticipate user needs and respond to them in a more sensitive way.

Since the initial presentation of the concept a few years back, Emotech has turned their focus instead to licensing the system’s software (Olly’s brain) to the educational sector, opening up a whole host of new opportunities.

Opportunities for healthcare:

What if mHealth apps had the ability to become more like the patient by taking an empathetic approach, providing encouragement, support and advice that is personal to the individual’s behavioural and emotional patterns over the course of a treatment regimen?

For instance:

  • Could a system understand the level of discomfort of a patient self-injecting from facial expressions, and suggest an alternative technique for next time?
  • Could a system detect if a patient wasn’t able to keep up with the pace of video instructions for a medical device through their real-time actions, and automatically slow down or pause the playback? (A sketch of this idea follows the list.)
  • Could a system identify the low mood of an eczema sufferer during a flare-up through voice inflections or facial expressions, and provide motivation and encouragement for the patient to still attend their scheduled social events?
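As referenced in the second bullet, here is a minimal sketch of the adaptive-playback idea, assuming a hypothetical action-recognition signal that reports which instruction step the patient has reached. The PlaybackController class and its lag thresholds are illustrative assumptions, not a reference implementation.

```python
# Hypothetical sketch of adaptive playback: if observed task progress lags
# the instruction video, slow or pause it. Thresholds are illustrative.

class PlaybackController:
    def __init__(self):
        self.speed = 1.0

    def adapt(self, video_step: int, observed_step: int) -> str:
        """Adjust playback from a (hypothetical) action-recognition signal."""
        lag = video_step - observed_step
        if lag >= 2:                 # patient is two or more steps behind
            self.speed = 0.0
            return "pause"
        if lag == 1:                 # slightly behind: slow down
            self.speed = 0.75
            return "slow"
        self.speed = 1.0             # keeping pace: normal speed
        return "normal"

ctrl = PlaybackController()
print(ctrl.adapt(video_step=4, observed_step=2))  # -> "pause"
print(ctrl.adapt(video_step=4, observed_step=4))  # -> "normal"
```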

Responding to user context and condition

"AI AI facial detection system for driver assistance.

As technologies continue to evolve, we are seeing the importance of context awareness in AI, through proactive systems that can detect not only the physical condition of the user, but also the environmental conditions, and respond accordingly. One industry leading in this area is automotive, which takes a holistic approach to monitoring by combining biological markers, behavioural biomarkers and context awareness.

The application of AI in automotive has been evolving at a steady pace over the last decade. Wellbeing, health, and safety continue to be the focal points for these systems, with much earlier applications combining sensing technology from wearables, in-car sensors, and environmental data to initiate driver assistance.

The AI Facial Detection System by Exeros Technologies is an in-vehicle camera system that helps drivers to stay alert by detecting signs of tiredness and distraction while driving. ‘The system continuously scans the driver for signs of fatigue, mobile phone use, smoking and prolonged signs of distraction’, emitting audio alerts to warn drivers of their behaviour and draw attention back to driving. The clever system can also send signals to third-party hardware such as GPS tracking devices.
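As a generic illustration of this kind of escalation logic (our own sketch, not Exeros Technologies’ implementation), a per-frame detection stream might be debounced and escalated along these lines; the window size, thresholds and event names are illustrative assumptions.

```python
# Generic alert-escalation pattern for a driver-monitoring camera.
# Our own sketch; thresholds and event names are illustrative assumptions.

from collections import deque

class FatigueAlerter:
    def __init__(self, window: int = 10, threshold: int = 3):
        self.recent = deque(maxlen=window)   # last N per-frame detections
        self.threshold = threshold

    def update(self, fatigue_detected: bool) -> str:
        """Feed one frame's detection; escalate on repeated positives."""
        self.recent.append(fatigue_detected)
        hits = sum(self.recent)
        if hits >= 2 * self.threshold:
            return "escalate"     # e.g. notify third-party GPS/telematics hardware
        if hits >= self.threshold:
            return "audio_alert"  # warn the driver directly
        return "ok"

alerter = FatigueAlerter()
for frame in [True, True, True, False, True, True, True]:
    print(alerter.update(frame))  # ok, ok, audio_alert, ..., escalate
```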

As autonomous vehicles become more mainstream, there will be a greater demand for deeper human-to-machine cooperation. In the not-too-distant future, we may see vehicles with integrated elements of self-driving technology taking control if the driver fails to respond to alerts. Or looking a little further out, seamless switching between driver-control and autonomous mode based on real-time driver and contextual data.

Opportunities for healthcare:

What if digital healthcare applications and devices responded to or anticipated the needs of patients based on a combination of real-time context and the state of their condition, creating more tailored and immediate responses?

For instance:

  • Could a system automatically switch between audio and haptic reminders or alerts based on the ambient noise of the patient’s environment? (See the sketch after this list.)
  • Could a system offer tailored supportive and motivational advice for patients who suffer from anxiety disorders when out in public spaces?
  • Could a system monitor a patient using a drug delivery device in real time and detect potential usage errors, creating an alert or intervention to prevent harm or misuse?
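Picking up the first bullet, a context-driven modality switch might look like the following minimal sketch; the dB thresholds and the choose_modality helper are illustrative assumptions, not a reference implementation.

```python
# Illustrative only: pick a reminder modality from ambient noise level.
# The dB thresholds and choose_modality helper are our own assumptions.

def choose_modality(ambient_db: float, device_in_pocket: bool) -> str:
    """Select how to deliver a reminder given the patient's context."""
    if ambient_db >= 75:          # loud environment: audio likely missed
        return "haptic"
    if ambient_db <= 35 and not device_in_pocket:
        return "audio"            # quiet room, device visible: gentle chime
    return "audio+haptic"         # ambiguous context: use both channels

print(choose_modality(ambient_db=82, device_in_pocket=True))   # -> "haptic"
print(choose_modality(ambient_db=30, device_in_pocket=False))  # -> "audio"
```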

Capturing and casting human qualities and attributes

Image ref: Hyper Realistic Digital Avatar – Digital Human Project, The Scan Truck.

Coming back to the all-important area of empathy for this last example, it’s worth taking a moment to consider the future of chatbots within the healthcare context. For many, chatbots can be a challenge at the best of times, often creating annoyance and frustration; and for a patient, it’s hard to get any sense of empathy from computer-generated, text-based conversations, even with a photo of a person hovering in view. Empathy requires a far deeper sensory approach that relies on subtle verbal and non-verbal cues from both parties to create a connection and a sense of shared feeling.

Perhaps the notion of AI with ‘true empathy’ is still best left to the realms of science fiction; for many, it encroaches uncomfortably on what defines us as humans. But an area of potential lies in capturing and casting the qualities and attributes of real humans through hyper-realistic avatars, such as those created by mobile 3D scanning company The Scan Truck. Their Digital Human Project saw them team up with interactive content creators ICVR and actor Jason L White to create a hyper-realistic human experience, drawing on the benefits of photogrammetry-based 3D scanning to create 3D models of a real person.

A good example of a hyper-realistic avatar application can be seen in the financial sector, where, a few years back, investment bank UBS trialled a service utilising human-digital assistants, one of which consisted of a life-like avatar of their Chief Investment Officer in Switzerland, Daniel Kalt. The service allowed customers to book meetings with the digital version of Kalt, which interacted with customers through voice and eye contact. UBS wanted to test ‘acceptance of digital assistants in a wealth management context’, exploring alternative ways to provide their clients with ‘frictionless access’ to their services.

Over the last few years, companies in other sectors have continued to explore the potential of avatars to expand their service offering, creating ‘digital-doubles’ of real personnel in an attempt to create deeper connections with their customers through digital platforms.

Opportunities for healthcare:

What if mHealth apps featured hyper-realistic Healthcare Professional (HCP) avatars based on real healthcare providers, to help guide patients through their treatment regimens and bridge the gap between physical and digital healthcare experiences?

For instance:

  • Could we assign patients an HCP-avatar of someone from their real-life healthcare team, to create a more personalised and joined-up experience?
  • Could we use HCP-avatars to enhance digital mental health services and apps to make them less intimidating and more ‘human’?
  • Could we use facial recognition to detect the moods and emotions of patients and elicit an appropriate gestural response from the HCP-avatar, giving a sense of real-time understanding? (A minimal sketch follows this list.)
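To illustrate the last bullet, here is a toy mapping from a detected mood label to gesture and speaking-style cues that an avatar renderer might consume. The mood labels, gestures and respond function are entirely hypothetical.

```python
# Hypothetical mapping from a detected patient mood to avatar cues,
# as in the third bullet above. Labels and gestures are illustrative.

AVATAR_GESTURES = {
    "anxious":    ("soften gaze, slow nod", "calm, measured pace"),
    "frustrated": ("open palms, lean in",   "acknowledge, then simplify"),
    "low":        ("warm smile, tilt head", "encouraging, unhurried"),
    "neutral":    ("steady eye contact",    "informative, standard pace"),
}

def respond(detected_mood: str) -> dict:
    """Return gesture and speaking-style cues for the HCP-avatar renderer."""
    gesture, style = AVATAR_GESTURES.get(detected_mood, AVATAR_GESTURES["neutral"])
    return {"gesture": gesture, "speaking_style": style}

print(respond("anxious"))
# -> {'gesture': 'soften gaze, slow nod', 'speaking_style': 'calm, measured pace'}
```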

Greater than the sum of its parts

By taking a wider view of technologies and applications of Emotional AI across sectors, we can learn from and leverage the opportunities around behaviour change. By pairing those insights with a deeper understanding of behavioural science, we can also explore numerous opportunities to improve patient adherence in the future of digital healthcare. But ultimately, if the development of healthcare AI is to facilitate a deeper level of connection for patients through digital healthcare systems, a far more holistic approach is needed.

Using a Human-Centred Design (HCD) methodology within the development of digital healthcare allows patients to be put front and centre of innovation. HCD helps to uncover the unmet needs, desires and challenges of patients, fuelling the development of solutions that resonate deeply with people and ensuring that digital applications address the wider needs of patients on their treatment journey.

By combining the sensing landscapes (physical metrics, contextual metrics and emotional metrics), we can start to form a more rounded picture of what is happening to a patient, not just physiologically but also psychologically; eliciting highly personal responses that balance functional and emotional needs in real time, to create truly patient-centric solutions.
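As a closing sketch, the toy Python below fuses one reading from each sensing landscape into a single recommendation. All field names, thresholds and the recommend logic are illustrative assumptions about how such a balance might be struck, not a prescription.

```python
# A toy fusion of the three "sensing landscapes" named above. All field
# names, thresholds, and the recommend() logic are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class PatientSnapshot:
    heart_rate_bpm: int      # physical metric
    ambient_noise_db: float  # contextual metric
    distress_score: float    # emotional metric, 0..1 (e.g. from Emotional AI)

def recommend(s: PatientSnapshot) -> str:
    """Balance functional and emotional needs in one response."""
    if s.distress_score > 0.7 and s.heart_rate_bpm > 100:
        return "pause task; offer breathing exercise and reassurance"
    if s.ambient_noise_db > 75:
        return "defer spoken guidance; switch to on-screen prompts"
    if s.distress_score > 0.4:
        return "continue task with encouraging, slower-paced guidance"
    return "continue task with standard guidance"

print(recommend(PatientSnapshot(heart_rate_bpm=108, ambient_noise_db=40, distress_score=0.8)))
# -> pause task; offer breathing exercise and reassurance
```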

If you would like to hear more about the work we do within the healthcare sector at PDD, you can do so here.

If you would like to learn more or discuss an interesting project with us, get in touch!