While everybody talks about the potential for so-called “big data” in healthcare, there seems to be more smoke than fire at this point. To date, health payers have been a lot more engaged with using big data than providers, according to IDC Health Insights.
That being said, there are some providers out there who have been able to get their arms around big data projects which improve care, FierceHealthIT reports.
One example is the University of North Carolina Health Care (UNCHC), a health system based in Chapel Hill, N.C., which has begun programs to leverage big data in improving the quality of care and reporting, according to FierceHealthIT.
As the UNCHC system has grown, it has seen a dramatic increase in the amount of data each facility holds — and making things even more challenging, 80 percent of that data is unstructured, according to Carlton Moore, MD, associate professor of medicine at UNCHC.
As Dr. Moore notes, it's difficult to use unstructured data to meet accountable care objectives. For example, when a patient gets a cancer screening at another institution, the physician may record that fact in unstructured notes but never check off the structured field showing the study was done, since it wasn't performed in-house.
But UNCHC has taken on the mass of data under its roof. It has developed a unique algorithm, inserted into a natural language processing pipeline, which allows researchers to find and address abnormal results on Pap smears and mammography screenings.
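To make the idea concrete, here's a minimal, hypothetical sketch of the kind of rule-based text flagging such a pipeline might start with. The patterns and note text below are invented for illustration; UNCHC's actual algorithm is not public, and a production system would use far more sophisticated NLP than simple pattern matching.

```python
import re

# Illustrative patterns only -- NOT UNCHC's actual algorithm.
# Terms like ASCUS/LSIL/HSIL (Pap results) and BI-RADS 4/5 (mammography)
# commonly indicate abnormal screening findings.
ABNORMAL_PATTERNS = [
    r"\babnormal (?:pap smear|pap|mammogram|mammography)\b",
    r"\b(?:pap smear|mammogram)\b.{0,40}\b(?:ASCUS|LSIL|HSIL|BI-RADS [45])\b",
]

def flag_abnormal_result(note: str) -> bool:
    """Return True if an unstructured clinical note appears to mention
    an abnormal Pap smear or mammography result."""
    return any(re.search(p, note, re.IGNORECASE) for p in ABNORMAL_PATTERNS)

# Hypothetical example notes:
# flag_abnormal_result("Pt reports abnormal Pap smear at outside clinic, 2012.")
# flag_abnormal_result("Screening mammogram BI-RADS 4; biopsy recommended.")
# flag_abnormal_result("Routine mammogram performed; results normal.")
```

The point of even a toy version like this is that it surfaces results buried in free text that would otherwise never populate a structured quality-reporting field.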
While this is just a beginning, UNCHC has bigger plans. It intends to take next steps in analyzing and using its mass of data such as analyzing medication compliance and determining the number of clinic visits associated with bad health outcomes.
Kudos to UNCHC on its progress. But I don't expect to see a ton of these projects showing up in the public arena; there's just too much involved, particularly with ICD-10 and Meaningful Use draining resources like crazy.
In the meantime, though, providers may want to embrace "skinny" healthcare data, argues my colleague John Lynn. The concept: instead of creating a huge enterprise data warehouse for all purposes, why not focus on smaller problems? That might be a faster path to results, and decent preparation for the big data future, no?