Can the health app paradox be resolved using emotional AI?

The health app paradox is that apps meant to improve our health can sometimes do more harm than good. Can recent advancements in Affective Computing resolve this paradox?


skull surrounded by silhouettes of vegetables and fruits and the inscription ‘You look good!’
Illustration by Maria Kovalevich

Apps designed to help maintain your mental and physical health aim to transform you into a better version of yourself, don’t they?

But have you ever found yourself annoyed by untimely and overly optimistic notifications, such as “You’re doing great, fantastic progress today!” or “Keep up the pace”? Especially when you receive them during moments of sadness or when you’re feeling down.

Or perhaps you’ve felt a pang of guilt when logging a missed workout, just to avoid receiving well-intentioned but somewhat stern comments like “Consistency is the key to success, don’t give up!” or “I still believe in you!”

All of this can intensify your frustration, add unnecessary stress to your life, and even contribute to depression and various disorders.

This issue is discussed in more detail in this BBC article.

a watering can in the air, watering a flower surrounded by clouds and the sun
Illustration by Maria Kovalevich

As a result, this issue has drawn public attention. Scientists have studied it extensively, and even the UK government has tried to tackle it through legislation.

In response, developers of health-related apps began changing their product designs. For example, Under Armour, the company behind the MyFitnessPal app, acknowledged this concern and took steps to address potential misuse of the app. They have put in place “specific safeguards to reduce its attraction for individuals attempting to use it to enable detrimental eating behaviors,” as detailed in their response to Refinery29.

However, are these measures sufficient? Experts generally agree that simply adding warnings about the necessity of consulting a doctor before using the app may not be adequate.

What else can be done apart from adding these warnings or other elements to the app’s design?

I believe the solution lies in utilizing the latest advancements in emotional AI, also known as Affective Computing.

a silhouette of a human head containing emoji with different moods: happy, sad, surprised, angry, and doubtful
Illustration by Maria Kovalevich

But what can AI offer to address users’ psychological issues with health-related apps? First, a likely trigger for the problem is a rise in the user’s stress level while interacting with the app.

But hold on, you might say: if even the people closest to you sometimes can’t tell how sincere your “I’m fine!” really is, how could a soulless machine manage it?

To answer this question, let’s figure out which stress indicators, on an emotional, biological, and cognitive level, we can currently measure.

To avoid overwhelming you, I’ll list the top 10:

  1. Level of stress hormones (cortisol and adrenaline) in the blood.
  2. Heart rate (pulse) and blood pressure.
  3. Rate and frequency of respiration.
  4. Electrocardiogram (ECG) for analyzing heart activity.
  5. Changes in subjective self-reports (questionnaires and surveys).
  6. Analysis of blood biomarkers associated with inflammation, such as cytokines.
  7. Electroencephalogram (EEG) for studying brain activity.
  8. Blood glucose level.
  9. Skin temperature and changes in skin color (peripheral vascular response).
  10. Electrodermal activity (measuring skin’s sweat response).

Interestingly, just a few of these markers can expose even the best poker face through a regular camera: emotional states can be read by analyzing changes in breathing, heart rate, and subtle facial color shifts invisible to the naked eye.

You can learn more about this in Rosalind Picard’s interview on Lex Fridman’s podcast at the 30-minute mark.
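To make the heart-rate part concrete, here is a minimal Python sketch of the idea behind camera-based pulse reading (remote photoplethysmography): the average green-channel brightness of the face flickers slightly with each heartbeat, and the dominant frequency of that flicker gives the heart rate. The function and the simulated signal below are my own illustration, not code from any product mentioned in this article.

```python
import numpy as np

def estimate_heart_rate(green_means, fps=30.0):
    """Estimate heart rate (BPM) from the mean green-channel intensity
    of a face region over time, via the dominant spectral peak."""
    signal = np.asarray(green_means, dtype=float)
    signal = signal - signal.mean()              # remove the DC component
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    # Restrict to plausible human heart rates: 0.7-4 Hz (42-240 BPM)
    band = (freqs >= 0.7) & (freqs <= 4.0)
    peak_freq = freqs[band][np.argmax(spectrum[band])]
    return peak_freq * 60.0

# Simulated 10-second clip at 30 fps with a 1.2 Hz (72 BPM) pulse plus noise
np.random.seed(0)
t = np.arange(0, 10, 1 / 30)
fake_signal = 120 + 0.5 * np.sin(2 * np.pi * 1.2 * t) + 0.1 * np.random.randn(len(t))
print(round(estimate_heart_rate(fake_signal)))  # → 72
```

In a real system, the input would come from averaging pixel values over a detected face region in each video frame; the spectral-peak step stays the same.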

I hope I’ve convinced you that machines can sometimes understand us better than the people closest to us.

But let’s return to our problem and try to answer the question: how can we make use of these technologies to minimize all the harm that health apps can inflict?

a collage of medical devices
Illustration by Maria Kovalevich

Let me be clear: regular smartphones lack the capabilities to accurately measure all the necessary biomarkers, at least for now. So for stress monitoring, an additional device is needed to collect the data.

The COVID-19 pandemic brought a surge in depression and suicides, sparking significant interest within the scientific community in stress-related research.

Psychologists, medical professionals, and neurobiologists are actively developing diagnostic and management methods for stress, along with strategies to bolster mental health during crises.

As a result, we now have access to a diverse range of stress-measuring technologies.

Unfortunately, most of these technologies only work in laboratory or specially created conditions. They are designed for use by scientists in specific research studies.

And, to be frank, many of these devices don’t look very user-friendly. It’s hard to imagine handing a device with a bunch of wires to an average user and saying, “Hey, buddy, wear these wires all day.”

Photograph of a device for monitoring stress levels. Wires and sensors.
Image credits: MINDFIELD eSense

Just the look of this device can cause not only stress but also real panic. Right?

But what options do we have as ordinary creators of digital products?

Let’s start by outlining what we are looking for.

I’ll stick to the tradition of listing ten points:

  1. Convenience and comfort in use: The device for measuring biomarkers should be comfortable for 24/7 use.
  2. Accessibility and price: The technology should be accessible to a wide audience and reasonably priced.
  3. 24-hour monitoring: The sensor should continuously monitor biomarkers around the clock.
  4. Integration and compatibility: The technology should seamlessly integrate with our applications.
  5. Privacy and security: Ensuring user data protection and guaranteeing its confidentiality are essential.
  6. Personalization: Technology should adapt to individual user characteristics and needs for more accurate analysis.
  7. Adaptation to real-life conditions: The technology should be adapted for measurement in real-life situations.
  8. Analytics and interpretation: The system should provide analytics and interpretation of results so that users can better understand their emotional state.
  9. Diverse data sources: Technology should use a variety of physiological and psychological parameters for a more complete analysis of stress.
  10. Electrodermal activity (EDA) sensor: An EDA sensor in the device is key to the most accurate measurement of stress levels.

A small comment regarding point 10. During my research, I discovered that one of the factors that determines the accuracy of stress level measurements is the presence of an EDA sensor in the device. As an example, let’s consider this study, which states that “The result showed that EDA could classify the stress level with over 94% accuracy. This system could help people monitor their mental health during overworking, leading to anxiety and depression because of untreated stress.”
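As a rough illustration of how EDA data can be turned into a stress estimate, here is a toy Python sketch: it extracts two classic features from a skin-conductance trace (the tonic level and a count of phasic peaks) and applies simple thresholds. The thresholds and the rule-based classifier are hypothetical and mine alone; the study cited above used a trained model, not these rules.

```python
import numpy as np

def eda_features(conductance):
    """Extract two classic EDA features from a skin-conductance trace
    (in microsiemens): tonic skin conductance level (SCL) and a rough
    count of phasic skin conductance responses (SCRs)."""
    x = np.asarray(conductance, dtype=float)
    scl = x.mean()                      # tonic level: the slow baseline
    diffs = np.diff(x)
    # Count sharp rise-then-fall events as phasic peaks (> 0.05 µS steps)
    peaks = int(np.sum((diffs[:-1] > 0.05) & (diffs[1:] < -0.05)))
    return scl, peaks

def stress_level(conductance):
    """Toy rule-based classifier (hypothetical thresholds, for illustration)."""
    scl, peaks = eda_features(conductance)
    if scl > 8.0 or peaks > 5:
        return "high"
    if scl > 4.0 or peaks > 2:
        return "moderate"
    return "low"

np.random.seed(1)
calm = 2.0 + 0.01 * np.random.randn(240)   # flat, low-conductance trace
print(stress_level(calm))                  # → low
```

A production system would replace the thresholds with a model trained on labeled data, which is how results like the 94% accuracy above are obtained.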

For simplicity, I will give a brief description of each device, along with its main pros and cons.

close-up of a hand with a watch
Image credits: Empatica

EmbracePlus. Scientists consider this the gold standard for monitoring physiological data on the market. It was created by Empatica in partnership with Professor Rosalind Picard, a pioneer of Affective Computing. The price is currently unknown, and the product is not yet available for purchase. However, it’s expected to start at around €1,500, based on the pricing of the previous model (the Empatica E4).

Pros: The smallest and most accurate wearable device to date, combining PPG, accelerometer, gyroscope, temperature, and EDA sensors.

Cons: Currently, the device’s main downside is its unavailability for purchase.

close-up of a hand with a watch
Image credits: Nowatch

Nowatch. Starting at €447. The device is equipped with Philips EDA Biosensing Technology, which monitors changes in sweat gland activity by measuring skin conductance.

Pros: In addition to its sleek design, users can access real-time data through the iOS or Android app, which provides insights and tips for a balanced lifestyle. The watch includes multiple sensors, such as PPG, EDA, accelerometer, temperature, and barometer.

Cons: It lacks integration capabilities with third-party applications. In fairness, it’s worth mentioning that integration with the Health app is planned for the near future, but that’s all for now.

two fingers with electrodes connected to a measuring device
Image credits: MINDFIELD eSense

Mindfield eSense Skin Response. €169. It’s a compact sensor that utilizes your smartphone or tablet’s microphone input (compatible with Android and Apple iOS devices) to measure skin conductance.

Pros: In terms of pricing, the Mindfield eSense Skin Response generally offers a more budget-friendly option when compared to some other stress-monitoring devices on the market.

Cons: The biggest downside is its uncomfortable design, with numerous wires that make it impractical for 24/7 wear.

Well, it’s clear we have no winner here: each of the three devices has a critical drawback that prevents it from being used to bring more empathy to digital products at the moment.

But let’s imagine that EmbracePlus is already available on the market, or that NoWatch can now be integrated with third-party applications, or perhaps an even better device has appeared, surpassing the ones we already know.

How could we use them in that case? I can offer you a few hypotheses.

Here is a list of possible improvements for the previously mentioned MyFitnessPal:

  • Personalized recommendations: By tracking a user’s stress levels, these apps can tailor their recommendations. For example, during periods of high stress, they can suggest methods to relieve tension rather than focusing on calorie intake.
  • Emotional eating awareness: Stress often triggers emotional overeating. Apps can intervene in real time or suggest alternative ways to cope with stress, reducing the likelihood of emotional overeating.
  • Recommendations during illness (I’m missing these options in all the health apps I’m currently using): MyFitnessPal can tailor workout recommendations, monitor hydration and nutrition, track symptoms, and monitor progress to support the user’s health and recovery during illness.
  • The most important thing, in my opinion, is the identification of critical states: By analyzing emotional data, the app can identify signs of more serious psychological discomfort, such as depression or anxiety, and offer timely help, including referrals to specialists.
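The first two ideas above boil down to a simple notification-gating rule: before sending a motivational nudge, the app checks the user’s current stress estimate (say, from an EDA wearable) and swaps the cheerleading for something gentler. All messages, field names, and rules below are hypothetical, for illustration only.

```python
from dataclasses import dataclass

@dataclass
class UserState:
    stress: str           # "low" | "moderate" | "high", e.g. from an EDA wearable
    missed_workout: bool

def pick_notification(state: UserState) -> str:
    """Choose an empathetic message instead of a one-size-fits-all cheer."""
    if state.stress == "high":
        # Never nag a stressed user about missed goals; offer relief instead.
        return "Feeling tense? A 2-minute breathing exercise might help."
    if state.missed_workout:
        if state.stress == "moderate":
            return "Rest days matter too. Tomorrow is a fresh start."
        return "Missed a session? No problem, let's reschedule it."
    return "You're doing great, fantastic progress today!"

print(pick_notification(UserState(stress="high", missed_workout=True)))
```

The point of the sketch is the ordering of the checks: emotional state is consulted first, so the upbeat default message is only ever sent to a user who can actually receive it well.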

Well, I hope this will be enough to help fuel your imagination and come up with a few more ideas on how to improve the health app you personally use.

I’d be delighted if you share your ideas in the comments to this post.

So, what answer can I give to the question I posed at the beginning of the article: “Can the health app paradox be resolved using emotional AI?”

My answer is: it definitely can!

As we have seen, artificial intelligence has the capability to sensitively respond to changes in our emotional state, including monitoring stress levels during interactions with various applications. This opens the door to a more empathetic and user-centered approach to health and well-being.

But we definitely need at least a couple more years before we have a chance to bridge the gap between advanced sensor technology and convenient, affordable devices.

Perhaps scientists will still be able to fit all the sensors they need into, say, a small and elegant ring?

For example, I wear an Oura ring all the time, which gives me almost no discomfort (except for a couple of asanas in yoga, where you have to cross your fingers). But, unfortunately, Oura is not yet able to track my emotional state. Plus the health recommendations and analytics are still very generalised and unspecific.

But I believe that anything is possible! Particularly considering the rapid pace of AI development.

Can you believe that ChatGPT was released only six months ago? Yeah, neither can I.

So keep up to date with the latest news in empathy development for digital products by staying tuned!
