Could Big Data Improve Population Health Outcomes? Not Without Interoperability, Experts Say

    May 24, 2016

    By Bill Andruss

    Big data promises to transform how healthcare is delivered and evaluated. Before that can happen, however, administrators must ensure that their entire infrastructure is capable of collecting data that can be consolidated and analyzed in a meaningful, valuable way.

    It’s long been anticipated that big data would be a technological boon for healthcare. From improving the health outcomes of patient populations, to eliminating inefficiencies everywhere from treatments to entire hospital systems, to helping providers move to a quality-based model of care, big data analytics has the potential to revolutionize the industry.

    However, the match between big data and healthcare has proved far less perfect than expected. These benefits depend on linking big data to population health, but as Harvard Medical School Professor Kenneth Mandl, MD, told Healthcare IT News, “The data doesn’t link very easily.” With a vast range of competing data types, clinical standards, technologies, and privacy concerns, sharing data has become more difficult than analyzing it. Without interoperable systems, it will be nearly impossible to generate actionable insight from these vast stores of information.

    The Upsides Are Big

    As McKinsey observes, the essential promise of big data to healthcare providers is the ability to aggregate and analyze many discrete clinical and patient databases in order to provide better care. With this data in full view, companies can drive evidence-based treatment options and decrease the costs of care — quality, rather than cost-reduction, will become the goal. For individual patients, providers can ensure that treatments aren’t redundant and determine whether a different approach might yield better results.

    As another Healthcare IT News article notes, professionals are already using machine learning techniques, sophisticated algorithms, natural language processing, and image processing to identify subtle trends and improve disease diagnostics. Far from being wide-eyed tech buzzwords, these strategies have produced tangible results, such as the ability to identify skin cancer through a picture, or predict disease through genomic analysis.

    Interoperability Is a Barrier

    According to TechTarget, however, clinical research data and medical databases are highly fragmented, and there are very few standards on which providers agree. Additionally, HIPAA regulations and other privacy measures make data-sharing difficult, especially where information can potentially identify an individual patient. While standards such as FHIR (Fast Healthcare Interoperability Resources) have eased progress, development has been slow.
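    To make the standards point concrete, here is a minimal sketch of what FHIR buys an integrator: because a conforming server returns resources in a standardized JSON shape (e.g. a `Patient` resource from `GET [base]/Patient/{id}`), any system can extract the same demographics without a vendor-specific mapping. The sample resource below is illustrative, not taken from any particular system.

    ```python
    import json

    # A minimal FHIR R4 Patient resource, as a server might return it
    # for GET [base]/Patient/example (Content-Type: application/fhir+json).
    patient_json = """
    {
      "resourceType": "Patient",
      "id": "example",
      "name": [{"family": "Chalmers", "given": ["Peter", "James"]}],
      "birthDate": "1974-12-25"
    }
    """

    patient = json.loads(patient_json)
    assert patient["resourceType"] == "Patient"

    # Field names and structure are fixed by the FHIR specification,
    # so this extraction works against any conforming source.
    name = patient["name"][0]
    display = " ".join(name["given"]) + " " + name["family"]
    print(display)                 # Peter James Chalmers
    print(patient["birthDate"])    # 1974-12-25
    ```

    The hard part, as the article notes, is not parsing any one resource but getting fragmented systems to emit this common shape in the first place.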

    Providers also encounter the reverse problem: too much data. Oftentimes, companies lack the technological means to process mountains of information, let alone prioritize findings. On the ground level, clinicians working in an electronic health record (EHR) interface may receive multiple redundant notifications, or many varying alerts with no indication of relevance, or otherwise be forced to sift through reams of information manually.

    At the most basic level of interoperability, information is aggregated, but not necessarily filtered and organized.
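    The gap between aggregating and organizing can be sketched in a few lines. The alert records and field names below are hypothetical, not drawn from any EHR vendor; the point is simply that collapsing redundant alerts and ranking the rest by severity is a filtering step on top of raw aggregation.

    ```python
    # Hypothetical aggregated alerts; in a raw feed, a clinician sees all four.
    alerts = [
        {"patient": "p1", "type": "drug-interaction", "severity": 2},
        {"patient": "p1", "type": "drug-interaction", "severity": 3},  # redundant repeat
        {"patient": "p1", "type": "abnormal-lab",     "severity": 1},
        {"patient": "p2", "type": "drug-interaction", "severity": 1},
    ]

    def organize(alerts):
        """Keep one alert per (patient, type) — the most severe — and
        surface the highest-severity alerts first."""
        best = {}
        for a in alerts:
            key = (a["patient"], a["type"])
            if key not in best or a["severity"] > best[key]["severity"]:
                best[key] = a
        return sorted(best.values(), key=lambda a: -a["severity"])

    filtered = organize(alerts)
    print(len(filtered))  # 3 distinct alerts instead of 4 raw notifications
    ```

    Even this toy version shows why interoperability alone is not enough: once data is aggregated, someone still has to decide what is redundant and what is relevant.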

    Interoperability Should Begin on a Small Scale

    While most providers appreciate the merits of big data, many don’t see a clear path forward for reorienting or scaling their legacy IT systems to handle the processes that underpin it. The reality is that big data strategies will only be profitable once individual hospitals, practices, and other medical centers gain effective control over their data. Many medical IT systems were initially built to support paper systems — providers will have to about-face if they hope to become data-centric organizations.

    In this sense, interoperability is as much a capacity problem as it is a standards problem. Healthcare companies will need to gain a comprehensive understanding of their forward-looking capacity growth to invest effectively in infrastructure that can support new services.

    Ideally, the efficiencies gleaned from big data will make cost a secondary concern, but for now, the business side of healthcare still has to budget and plan for supporting big data. Effective capacity management, however, is entirely realistic for healthcare companies using the right tools. Hopefully, a higher quality of care will soon be a reality, too.

    (Image credit: Nicoli Berntsen/Unsplash)