How RPM Can Reduce AI’s Bias Problem & Improve Health Equity

The following is a guest article by Arnaud Rosier, PhD, Founder and CEO at Implicity

Artificial intelligence (AI) is one of the most promising breakthrough technologies of the modern healthcare era, yet it also has the potential to be one of the most dangerous. AI algorithms that are trained on limited or poorly representative data sets can exhibit signs of bias in their results, skewing decision-making and possibly leading to ethnic, gender, and social discrimination and other unintended consequences for the patients they serve.

Unfortunately, research shows that bias is already creeping into the nascent field of AI and machine learning. In 2019, one study found that a widely used algorithm was underrepresenting the illness burden of Black patients compared to white patients, meaning that Black individuals had to be much sicker to receive a recommendation for the same level of care as their white counterparts. It has also been well documented that Watson, IBM's medical AI, was affected by bias in many cases, recommending therapies that were not accessible to the populations using the software.

Concerns over bias create distrust in AI and often keep healthcare leaders from fully embracing the technology. It is imperative that we address the rising risks of AI bias before the ecosystem becomes even more established. We must find better ways of connecting with more diverse and representative patient populations, building trust by training algorithms on large and diverse datasets.

Remote patient monitoring (RPM) can be one key to achieving this goal. By reaching more patients in different geographies, and reducing barriers to patient access, RPM can help build trust in AI and improve health equity by broadening the diversity of datasets used to train AI algorithms. Progress is already being made in the area of remote cardiac monitoring. 

Unbiased Prediction of Heart Failure

The first step to reducing bias in AI tools is to increase the diversity and representation of data. Given the growing use of cardiac remote monitoring, an increasing volume of patient data is being gathered from connected devices.

Moreover, in 2019, as part of its national strategy on AI, the French government created the Health Data Hub. The platform brings together nationwide sources of health data, including resource utilization such as hospitalizations and follow-up visits, as well as medications and causes of death. Because France has a highly centralized single-payer system, this data is gathered from across the entire country. The database was made available to selected organizations, and Implicity was the only cardiac remote monitoring platform to gain access. Implicity is now using the nationwide database to develop research and algorithms with better performance and less bias.

The Health Data Hub provides access to anonymized patient health information from more than 3.7M people. Implicity has combined this data with data collected from remote cardiac monitoring devices, creating a unique dataset that serves as the foundation for an innovative algorithm that can reliably predict acute heart failure episodes in patients with monitored cardiac implants. Because of the robust datasets being used, this algorithm can potentially eliminate or drastically reduce bias and improve health equity.

Benefits Beyond the Algorithm 

Aside from reducing bias in AI, RPM is also changing how clinical research is performed by broadening patient access to studies. For example, equipping cardiac patients with RPM devices in their homes can reduce the need to come into the clinic for routine checks of things like blood pressure, weight, cardiac rhythms, or blood sugar. This can make participation in research more viable and attractive for more diverse patient groups, including those with limited access to centralized trial sites.

Today, research is often conducted in urban areas at large academic medical centers (AMCs), which can be hard to reach for rural populations and those facing other transportation barriers. Trials demand regular attendance at frequent appointments, which can be problematic for people who cannot afford time off work, the expense of childcare, or the risks of leaving other family members at home without a caregiver.

As a result, only patients who have adequate time, money, and social support are able to participate in research or contribute their data to AI tools and similar projects. These patients tend to be less likely to have significant burdens of chronic disease, more likely to have higher health literacy – and, due to the nature of systemic oppression in the United States, more likely to be white than members of other racial and ethnic groups.

We know that the same therapy can act differently in people of diverse genetic backgrounds. And we know that socioeconomic burdens can significantly affect a patient's ability to access and adhere to recommended care. But we are not doing enough to extend the healthcare system to the places where underserved populations live, work, and play. By digitizing home health-related data at the source, RPM helps reduce selection bias in research.

Creating a More Equitable Future 

RPM also offers the advantage of continuous data collection in many cases, giving researchers a much richer and more accurate picture of a person’s health unaffected by “white coat syndrome,” which can alter certain readings. Real-world data that is collected as part of everyday life is extremely valuable for identifying the efficacy and safety of new therapies and devices. 

Developing a strong feedback loop between RPM and AI to support continuous improvement is especially important since many RPM devices rely on AI algorithms to perform their basic functions to begin with. Ensuring that developers are learning from the experiences of actual patients using their devices outside of tightly controlled research settings can help to identify hidden biases and course correct before any issues arise.

As AI becomes more sophisticated, we must invest in patient recruitment strategies and data governance guardrails that prioritize equity and take advantage of RPM and other technologies to reduce barriers to accessing representative data.

Studies and algorithm development projects should include diverse perspectives in the design phase, including clinicians and patient participants with varying backgrounds. Institutions sponsoring research projects, or companies developing algorithms, should establish minimums for diversity and inclusion in their training data sets to ensure algorithms start off on the right foot.
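As a purely illustrative sketch of what such a minimum might look like in practice, the short Python example below checks a training table's demographic composition against representation thresholds before model development proceeds. The column names, group labels, and threshold values here are hypothetical placeholders, not an established standard or any organization's actual policy.

    import pandas as pd

    # Hypothetical minimum representation thresholds (as fractions of the training set).
    # Real thresholds would be set by the sponsoring institution or algorithm developer.
    MINIMUM_REPRESENTATION = {
        "sex": {"female": 0.40, "male": 0.40},
        "age_band": {"<50": 0.15, "50-70": 0.30, ">70": 0.20},
    }

    def check_dataset_diversity(df: pd.DataFrame, minimums: dict) -> list[str]:
        """Return a list of representation shortfalls in the training data."""
        shortfalls = []
        for column, groups in minimums.items():
            observed = df[column].value_counts(normalize=True)
            for group, required_share in groups.items():
                share = observed.get(group, 0.0)
                if share < required_share:
                    shortfalls.append(
                        f"{column}={group}: {share:.1%} of records, "
                        f"below the {required_share:.0%} minimum"
                    )
        return shortfalls

    # Example usage with a hypothetical training cohort:
    # training_df = pd.read_parquet("training_cohort.parquet")
    # problems = check_dataset_diversity(training_df, MINIMUM_REPRESENTATION)
    # if problems:
    #     raise ValueError("Training set fails diversity minimums:\n" + "\n".join(problems))

A check like this is only a starting point; representation in the raw data does not guarantee equitable performance, which is why ongoing evaluation matters as well.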

Meanwhile, researchers should explore the potential role of RPM devices in these initiatives to make projects more accessible to traditionally underserved patients, and provide detailed training and coaching for the patients who will be using these tools in the home setting. Algorithms already on the market should also be continually evaluated for their accuracy, applicability, and equity across real-world groups, including gender, racial, ethnic, and age-related categories of users.
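To make that kind of ongoing evaluation concrete, here is a minimal sketch of one possible approach: comparing a model's discrimination (AUC, via scikit-learn's roc_auc_score) across demographic subgroups in real-world monitoring data. Field names such as heart_failure_event, model_risk_score, and age_band are illustrative assumptions, not references to any specific product's data model.

    import pandas as pd
    from sklearn.metrics import roc_auc_score

    def auc_by_subgroup(df: pd.DataFrame, group_col: str,
                        label_col: str = "heart_failure_event",
                        score_col: str = "model_risk_score") -> pd.DataFrame:
        """Compute the model's AUC separately for each demographic subgroup."""
        rows = []
        for group, subset in df.groupby(group_col):
            if subset[label_col].nunique() < 2:
                continue  # AUC is undefined when a subgroup has only one outcome class
            rows.append({
                "group": group,
                "n": len(subset),
                "auc": roc_auc_score(subset[label_col], subset[score_col]),
            })
        return pd.DataFrame(rows, columns=["group", "n", "auc"]).sort_values("auc")

    # Example usage on hypothetical real-world predictions:
    # results = pd.read_csv("monitoring_predictions.csv")  # outcomes, model scores, demographics
    # print(auc_by_subgroup(results, group_col="age_band"))
    # print(auc_by_subgroup(results, group_col="sex"))

A persistent performance gap between subgroups in a report like this would be a signal to investigate the training data, recalibrate, or retrain before the disparity affects care.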

By integrating RPM devices into clinical trials and the AI research and development process, the healthcare industry can avoid unintentional bias, support greater health equity, and give more patients the chance to achieve better outcomes with the help of cutting-edge technologies. 

About Dr. Arnaud Rosier

Dr. Arnaud Rosier is a cardiac electrophysiologist with a PhD in symbolic artificial intelligence. He founded Implicity, a clinical algorithm company, in 2016 to help HCPs optimize remote cardiac monitoring and improve their patients' outcomes, and serves as its CEO. With 20 years of experience in cardiac electrophysiology and 15 years in artificial intelligence and knowledge engineering applied to health, Arnaud is the author of a dozen international publications in peer-reviewed journals in the fields of cardiology and AI. Arnaud is also an angel investor in digital health companies including Cardiologs, Lifen, Prove Labs, LifePlus, Pixacare, Qynapse, and Biloba.
