Over the last several years we’ve talked a lot about the role of wearables and remote monitoring devices in changing healthcare delivery, and for good reason.
By monitoring consumers around the clock, wearables can bring real-world in situ data into the clinical setting, while remote monitoring devices can supercharge virtual care by tracking key health metrics like blood glucose levels and weight.
But maybe it’s time to expand our vision. What if machines could not only track traditional medical metrics, but also tell providers how we feel?
According to one researcher, this might be possible sooner than we think, leading to potentially massive changes in healthcare delivery, reports Medical News Today.
Neuroscientist and technologist Poppy Crum, Ph.D., who spoke at the recent Wired Health annual conference in London, said that technology is getting better at catching physiological “giveaways” that reveal what we’re experiencing. Not only that, these devices are 10 times cheaper than they were decades ago, she added.
Crum, who serves as chief scientist at Dolby Laboratories in San Francisco and as adjunct professor at Stanford University’s Center for Computer Research in Music and Acoustics, noted that one very visible response is that our pupils dilate when we’re struggling to understand something.
Researchers who study “pupillometry” are beginning to be able to track cognitive processes like memory, attention and mental load by looking at behavior and measuring pupil diameter, Crum told the audience.
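To make the pupillometry idea concrete, here is a minimal sketch of how a rising pupil diameter might be flagged against a resting baseline. Everything here is an illustrative assumption on my part (the sample trace, the z-score threshold, the baseline window), not a method Crum described:

```python
# Hypothetical sketch: flagging "cognitive load" moments in a pupil-diameter
# trace (millimeters). The data, threshold, and baseline window are all
# illustrative assumptions, not a real pupillometry pipeline.
from statistics import mean, stdev

def dilation_events(diameters_mm, z_threshold=2.0, baseline_n=5):
    """Return indices where pupil diameter rises more than z_threshold
    standard deviations above a resting baseline (first baseline_n samples)."""
    baseline = diameters_mm[:baseline_n]
    mu, sigma = mean(baseline), stdev(baseline)
    return [i for i, d in enumerate(diameters_mm)
            if sigma > 0 and (d - mu) / sigma > z_threshold]

# A resting pupil (~3 mm) that dilates sharply mid-trace:
trace = [3.0, 3.1, 2.9, 3.0, 3.1, 3.0, 3.8, 4.1, 4.0, 3.1]
print(dilation_events(trace))  # indices of the dilated samples
```

Real pupillometry work controls for confounds this toy ignores, most obviously ambient light, which changes pupil size far more than mental load does.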
She also cited experiments demonstrating that skin conductance can potentially predict a person’s emotional response to movies or sporting events, and that the amount of sweat a person secretes, as well as changes in the skin’s electrical resistance, can predict states such as stress, excitement, and anger, MNT reported.
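The skin-conductance idea can be sketched just as simply: count discrete skin-conductance responses (SCRs) in a microsiemens trace as a crude arousal index. This is a toy stand-in of my own, assuming a made-up trace and threshold; real electrodermal-activity pipelines separate tonic and phasic components rather than counting threshold crossings:

```python
# Illustrative sketch: counting phasic skin-conductance responses (SCRs)
# in a conductance trace (microsiemens) as a rough arousal index.
# The trace and min_rise threshold are illustrative assumptions.
def count_scrs(trace_us, min_rise=0.05):
    """Count local peaks that rise at least min_rise microsiemens
    above the preceding trough (a crude proxy for SCR events)."""
    count = 0
    trough = trace_us[0]
    for prev, cur, nxt in zip(trace_us, trace_us[1:], trace_us[2:]):
        trough = min(trough, cur)              # track the running trough
        if cur > prev and cur >= nxt and cur - trough >= min_rise:
            count += 1                         # peak with a big enough rise
            trough = cur                       # reset for the next response
    return count

# Two small conductance surges over a ~2 microsiemens tonic level:
sample = [2.0, 2.0, 2.1, 2.3, 2.2, 2.1, 2.1, 2.4, 2.2]
print(count_scrs(sample))
```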
What impact could empathetic devices have on healthcare? The potential applications are endless, of course. Where to even begin?
In her talk, Crum specifically noted that empathetic technologies may be able to offer important supports to mental health treatment simply by analyzing a patient’s voice. (Obviously, there are consent issues here but that’s for another article.)
Apparently, researchers have already used AI to analyze data they’d gathered on syntactic patterns, pitch-reflex and use of pronouns to accurately detect the onset of depression, Alzheimer’s disease and schizophrenia. Perhaps some combination of AI analytics and other data could predict suicidality in a patient undergoing virtual therapy or psychiatric care? Just wow.
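One of those linguistic signals, pronoun use, is easy to illustrate: elevated first-person singular pronoun rates have been reported as a correlate of depression in some research. The counter below is purely illustrative and in no way diagnostic; actual systems combine many acoustic and syntactic features under clinical validation:

```python
# Toy illustration of one linguistic feature: the fraction of word tokens
# that are first-person singular pronouns. Illustrative only, not a
# screening tool.
import re

FIRST_PERSON = {"i", "me", "my", "mine", "myself"}

def first_person_rate(transcript):
    """Fraction of word tokens that are first-person singular pronouns."""
    tokens = re.findall(r"[a-z']+", transcript.lower())
    if not tokens:
        return 0.0
    return sum(t in FIRST_PERSON for t in tokens) / len(tokens)

print(first_person_rate("I feel like my work never reflects me"))
```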
Or what about using empathetic tech to refine medication management? Crum envisions a time when by combining data on drug regimens with empathetic technology, doctors would “gain a closed feedback loop of data from the patient, changing drugs and therapies based on your signals,” she told the crowd.
It’s hard to imagine a world in which, say, a robot with onboard empathetic technology could do just as good a job of assessing certain aspects of a patient’s condition as a physician would, but it could be possible. Technology can’t establish true rapport with a patient, but it can be a friendly sidekick.
To be clear, as with AI generally, scientists like Crum aren’t suggesting that empathetic technology will replace clinicians. But it seems likely that it could have a huge effect on how care is delivered in some settings, especially when doctors are seeing patients remotely. I don’t know about you, but I’m psyched to see what direction all of this takes.