The following is a guest post by Sarah E. Fletcher, BS, BSN, RN-BC, MedSys Group Consultant.
It is generally agreed that bigger is better. When it comes to data, big data can be a challenge as well as a boon for healthcare. As Meaningful Use drives electronic documentation and technologies grow to support it, big data is a reality that has to be managed to be meaningful.
Medical databases are growing into petabytes of data drawn from any number of sources covering every aspect of a patient’s stay. Hospitals can capture every medication, band-aid, or vital sign. Imaging studies and reports are stored in imaging systems alongside scanned documents and EKGs.
Each medication transaction includes drug, dose, and route details, which are sent to the dispensing cabinet. The patient and medication can be scanned at the bedside and documentation added in real time. Each step of the way is logged with a time stamp, including provider entry, pharmacist verification, and nurse administration. One dose of medication generates dozens of individual data points.
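To make the scale concrete, a single administered dose might be modeled as a record like the following. This is a minimal sketch, not any vendor's actual schema; the field names and sample values are illustrative assumptions.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class MedicationDose:
    """One administered dose; every field is a separate data point."""
    patient_id: str
    drug: str
    dose: str
    route: str
    # Each workflow step is logged with its own timestamp.
    provider_entry: Optional[datetime] = None
    pharmacist_verification: Optional[datetime] = None
    nurse_administration: Optional[datetime] = None

# Illustrative values only.
dose = MedicationDose(
    patient_id="MRN-001",
    drug="metoprolol",
    dose="25 mg",
    route="PO",
    provider_entry=datetime(2013, 5, 1, 8, 0),
    pharmacist_verification=datetime(2013, 5, 1, 8, 15),
    nurse_administration=datetime(2013, 5, 1, 9, 0),
)
```

Multiply a record like this by tens of thousands of doses per month and the volume problem is immediate, before a single vital sign or bandage is counted.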
All of this data is captured for each medication dose administered in a hospital, which can be tens of thousands of doses per month. Translate the extent of data captured to every patient transfer, surgery, or bandage, and the scope of the big data becomes clearer.
As Health Information Exchanges (HIEs) mature, hospitals will have access not just to their own patient data, but to everyone else’s as well. Personal health records (PHRs), maintained by the patients themselves, may also feed into big data, contributing every mile run, every blood pressure or weight measured at home, and every medication taken.
One of the primary challenges with big data is that the clinicians who use the data do not speak the same language as the programmers who design the system and analyze the data. Determining how much data should be displayed, and in what format, should be a partnership between the clinical and technical teams to ensure the clinical relevance of the data is maximized to improve patient outcomes. Big data is a relatively new phenomenon, and data analysts able to manage these vast amounts of data are in short supply, especially those who also understand clinical data needs.
Especially challenging is the mapping of data across disparate systems. Much of the data is pooled into backend tables with little to no structure. There are many different nomenclatures and databases used for diagnoses, terminology, and medications. Ensuring that discrete data points pulled from multiple sources match in a meaningful way when the patient data is linked together is a programmatic challenge.
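The mapping problem usually comes down to a crosswalk: each source system's local code must be translated to a shared identifier before records can be linked. A minimal sketch of the idea, with entirely made-up system names and codes:

```python
# Hypothetical crosswalk: (source system, local code) -> shared identifier.
# All system names and codes here are illustrative, not real vocabularies.
CROSSWALK = {
    ("pharmacy_sys", "MET25"): "shared:0001",
    ("emr_sys", "D-4417"): "shared:0001",   # same drug, different local code
    ("emr_sys", "D-9001"): "shared:0002",
}

def normalize(source: str, local_code: str):
    """Return the shared code, or None when no mapping exists yet."""
    return CROSSWALK.get((source, local_code))

# Records from two systems now match on the shared identifier.
assert normalize("pharmacy_sys", "MET25") == normalize("emr_sys", "D-4417")
```

In practice the gaps matter as much as the matches: every `None` returned here is a data point that silently drops out of a linked patient record, which is why this mapping work needs both technical and clinical review.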
Now that clinicians have the last thousand pulse measurements for a group of patients, what does one do with them? Dashboards are useful for recent patient data, but how quickly they populate is critical for patient care. Rendering this data requires a stable wireless network with significant bandwidth, processing power, and storage, all of which come at a cost, especially when privacy regulations must be met.
Likely the biggest challenge of all, and one often overlooked, is the human factor. The average clinician does not know about technology; they know about patients. The computer or barcode scanner is a tool to them just like an IV pump, glucometer, or chemistry analyzer. If it does not work well for them consistently, in a timely and intuitive fashion, they will find ways around the system in order to care for their patients, not caring that it may compromise the data captured in the system.
Most people would point out that the last thousand measurements of anything is overkill for patient care, even if graphed to show a visual trend. There are some direct benefits of big data for the average clinician, such as being able to compare every recent vital sign, medication administration, and lab result on the fly. That said, most of the benefit is indirect, arriving through improvements to health systems and health outcomes.
The traditional paper method of auditing was to pull a certain number of random charts, often a small fraction of one percent of patient visits. This gives an idea of whether certain data elements are being collected consistently, documentation completed, and quality goals met. With big data and proper analytics, the ability exists to audit every single patient chart at any time.
The quality department can use reports and trending graphics to verify that its measures are met, not just for a percentage of a population, but for each and every patient visit for as long as the data is stored. This can be done by age, gender, level of care, and even by eye color, if that data is captured and the reports exist to pull it.
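The shift from sampling to auditing everything can be sketched in a few lines. The visit records, field names, and the completeness rule below are all illustrative assumptions, not any actual quality measure.

```python
# Toy data set: in reality this would be every visit in the database.
visits = [
    {"age": 67, "gender": "F", "level_of_care": "ICU", "assessment_documented": True},
    {"age": 45, "gender": "M", "level_of_care": "Med/Surg", "assessment_documented": False},
    {"age": 72, "gender": "F", "level_of_care": "ICU", "assessment_documented": True},
]

def audit(records, **criteria):
    """Audit every record matching the criteria; return the compliance rate."""
    matched = [r for r in records
               if all(r.get(k) == v for k, v in criteria.items())]
    if not matched:
        return None  # no visits match the filter
    compliant = sum(r["assessment_documented"] for r in matched)
    return compliant / len(matched)

# Every ICU visit is checked, not a random handful of charts.
icu_rate = audit(visits, level_of_care="ICU")
```

The same function filters by any captured field, which is the point of the paragraph above: if the data element exists and a report can pull it, the audit can slice on it.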
Researchers can use this data mining technique to develop new evidence to guide future care. By reviewing the patients with the best outcomes in a particular group, correlations can be drawn, evaluated, and tested based on the data of a million patients. Positive interventions discovered this way today can be turned into evidence-based practice tomorrow.
The sheer scope of big data is its own challenge, but the benefits have the potential to change healthcare in ways that have yet to be considered. Big data comes from technology, but Meaningful Use is not about implementing technology. It is about leveraging technology in a meaningful way to improve the care and outcomes of our patients. This is why managing big data is so critical to the future of healthcare.
MedSys Group Consultant, Sarah E. Fletcher, BS, BSN, RN-BC has worked in technology for over fifteen years. The last seven years have been within the nursing profession, beginning in critical care and transitioning quickly to Nursing Informatics. She is a certified Nurse Informaticist and manages a regular Informatics Certification series for students seeking ANCC certification in Nursing Informatics. Sarah currently works with MedSys Group Consulting supporting a multi-hospital system.