Recent crises have shown us that much of what goes on in health care is hidden from us, and that we often learn too late about problems we should have known much more about. There was, of course, the little-noticed spread of AIDS in the 1980s and of COVID-19 in December 2019 and early 2020. We also have more and more revelations of racial and gender disparities in treatment, a problem that long went unidentified. And a few determined researchers are trying to find out the impacts of the health care system’s inadequate attention to LGBTQ+ people.
I recently talked with Christopher Malter, CEO of Avalon.ai, a company applying AI to healthcare. He explained some of the interesting ways they are applying data to serious public health problems.
Campus COVID-19 contact tracing
Contact tracing is a critical component of strategies to control infectious diseases. Modern apps make contact tracing faster and more effective. But COVID-19 presents unusual challenges to contact tracing. Because people can spread the virus while being totally asymptomatic, it can fly under the contact tracing radar and become widespread. And a disease that is both hidden and widespread is harder to control through contact tracing.
Avalon.ai has found an environment especially appropriate for COVID-19 contact tracing: college campuses. They are geographically constrained, and to some extent constrained in the people located there. Campuses are certainly not sealed off: obviously, students go outside for shopping and entertainment (as well as visits home), and people come onto campus for deliveries, teaching, and other things. But the location and its inhabitants are still more isolated than average urban areas.
The Avalon.ai app integrates with other sources of information in the university. The app can draw on information about who’s in classes, any social events that are permitted by the university, who’s in each dorm, etc. Analytics that combine contact tracing with this other information can help the university predict and prevent outbreaks.
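To make the idea concrete, here is a minimal sketch of how contact-tracing events might be combined with class rosters to flag students for notification. The function name, data structures, and fields are my own illustrative assumptions, not Avalon.ai’s actual schema or algorithm:

```python
# Hypothetical sketch: combine app-recorded proximity events with
# class rosters to expand the set of students flagged after a
# positive test. All names and structures here are assumptions.

def flag_exposures(positive_cases, proximity_events, class_rosters):
    """Return the set of student IDs to notify.

    positive_cases:   set of student IDs who tested positive
    proximity_events: list of (id_a, id_b) close-contact pairs from the app
    class_rosters:    dict mapping class ID -> set of enrolled student IDs
    """
    flagged = set()
    # Direct contacts recorded by the app.
    for a, b in proximity_events:
        if a in positive_cases:
            flagged.add(b)
        if b in positive_cases:
            flagged.add(a)
    # Classmates of positive cases (shared indoor exposure).
    for roster in class_rosters.values():
        if roster & positive_cases:
            flagged |= roster
    # Positive cases are already known; don't re-flag them.
    return flagged - positive_cases
```

The point of the sketch is the join: the app’s proximity data alone would miss the classmate sitting two rows back, while the roster data alone would miss the off-campus contact.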
Better standards for prescribing pain-killers
The other application for Avalon.ai that I’ll discuss is research toward better standards for prescribing pain-killers. This research can catch anomalies in prescribing–whether caused by ignorance or fraud–and eventually lead to revised standards.
The recommended doses for pain-killers vary according to several factors, including the type of procedure. A knee operation is expected to need a different type and quantity of opioids than a rotator cuff operation, and the duration of that pain treatment may also vary. But many recommendations, such as those published by the Centers for Disease Control, were developed decades ago based on initial clinical studies. A wealth of data has been generated over the years that can help adjust dosages.
We all know the dangers of overprescribing opioids: the epidemic of addiction rivals that of COVID-19. But underprescribing is also harmful: it not only leaves the patient to suffer unnecessarily, but can hold back recovery from the procedure and the patient’s return to full functioning. Moreover, there’s a racial disparity: African-Americans tend not to receive enough pain relief medication.
Avalon.ai contracted with the Centers for Disease Control and Prevention to collect data at a national level for its analytics. They are currently concentrating on orthopedics, which is in itself a huge discipline with many sub-specialties and a lot of patients. One interesting characteristic of orthopedics–as opposed to oncology and some other disciplines that make use of pain-killers–is that a lot of orthopedic patients are teens and young people, thanks to sports injuries. As we know from countless heart-breaking tales, these sports injuries have led too many young people into addiction to over-prescribed opioids.
Avalon.ai collects anonymized information on cases and opioid prescriptions from hospitals, clinics, and individual doctors. This allows them to identify anomalies and suggest changes in doctor behavior. The national scope gives the company much more data and therefore more accurate results than even a large hospital system could achieve on its own.
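One simple way such anomaly detection could work–and this is my own illustrative sketch, not a description of Avalon.ai’s method–is to compare each prescriber’s average dose for a given procedure against the national distribution and flag statistical outliers:

```python
import statistics

# Hypothetical sketch of prescribing-anomaly detection: flag
# prescribers whose mean dose for one procedure type sits far from
# the national norm. The threshold and field names are assumptions.

def flag_outlier_prescribers(doses_by_prescriber, z_threshold=2.5):
    """doses_by_prescriber: dict mapping prescriber ID -> list of
    doses (e.g., morphine milligram equivalents) for one procedure.
    Returns the prescriber IDs whose mean dose is a national outlier.
    """
    means = {p: statistics.mean(d) for p, d in doses_by_prescriber.items()}
    national_mean = statistics.mean(means.values())
    national_sd = statistics.stdev(means.values())
    return {
        p for p, m in means.items()
        if national_sd > 0 and abs(m - national_mean) / national_sd > z_threshold
    }
```

A real system would need to stratify by procedure, patient age, and comorbidities before comparing prescribers, but the benefit of national scope is visible even here: the more prescribers in the pool, the more stable the baseline against which anomalies stand out.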
The company then runs analytics to compare this data to what the public health authorities know about opioid addiction. To build that picture, Avalon.ai collects data from law enforcement, treatment centers, and other institutions that come into contact with opioid users. From these comparisons, Avalon.ai hopes to identify over-prescription in the standards and then to recommend changes in the standards.
I see the two applications described in this article as the face of future public health. Data has been critical to this field since John Snow’s famous identification of the source of cholera in nineteenth-century London. With careful anonymization and bias detection, our current tools give us orders of magnitude more insight.