The modern enterprise imaging and data value chain
In the past 20 years, the digital transformation of the health care industry has been rapid and extensive. Digitalization has created a new problem: information overload. According to one estimate, the volume of digitally generated health care data doubles every 73 days. Much of this data is stored in separate silos, such as Digital Imaging and Communications in Medicine (DICOM) images, reports, and multimedia files, which makes cross-system access difficult. Moreover, powerful diagnostic tools are often not interoperable. As a result, instead of supporting informed and effective decision-making, the technological revolution often stands in the way of better diagnosis and patient care.

Productivity improvements have benefited a wide range of industries, but the health care industry has largely missed out. From 1999 to 2014, productivity in the health care sector increased by just 8%, whereas other industries achieved far greater efficiency gains of 18%. Although productivity comparisons between industries are imprecise, they do show that health care lags behind other industries in both productivity and potential growth.

To increase productivity in health care operations, two things must happen. First, data must be understood as a strategic asset and used to create intelligent workflow solutions that encompass all aspects of imaging.

Second, to speak of a true value chain, all fields of competence must be connected, and the connection must be as seamless, open, and secure as possible. All relevant data must be available to patients, medical professionals, and researchers.

A modern enterprise imaging solution must prioritize outcome optimization, better diagnostics, and improved collaboration.

Health care today: gaps, bottlenecks, silos

The costs and consequences of the current fragmented state of health care data are far-reaching: operational inefficiencies and unnecessary duplication, treatment errors, and missed opportunities for basic research. Recent medical literature is full of examples of missed opportunities and patients put at risk by a lack of data sharing.

Every year, more than four million Medicare patients are transferred to skilled nursing facilities (SNFs). Many of these patients are elderly with complex conditions and can find the transition difficult. According to a 2019 study published in the American Journal of Managed Care, one of the main reasons patients fare poorly during this transition is a lack of health data sharing–including missing, delayed, or difficult-to-use information–between hospitals and SNFs. “Weak transitional care practices between hospitals and SNFs compromise quality and safety outcomes for this population,” researchers noted.

Sharing data is a problem even within hospitals. A 2019 American Hospital Association study published in the journal Healthcare analyzed interoperability functions that are part of the Promoting Interoperability program, which is administered by the U.S. Centers for Medicare & Medicaid Services (CMS) and adopted by qualifying U.S. hospitals. The study showed that among 2,781 non-federal, acute-care hospitals, only 16.7% had adopted all six core functionalities required to meet the program’s Stage 3 certified electronic health record technology (CEHRT) objectives. Data interoperability, clearly, is not yet a given in health care.

Data silos and incompatible data sets are another obstacle. In a 2019 article in the journal JCO Clinical Cancer Informatics, researchers analyzed data from The Cancer Imaging Archive (TCIA), looking specifically at nine lung and brain research data sets containing 659 data fields, in order to understand what would be required to harmonize data for cross-study access. The effort took more than 329 hours over six months, simply to identify 41 overlapping data fields in three or more files and to harmonize 31 of them.

As researchers wrote in an August 2019 article in npj Digital Medicine, “[i]n the 21st century, the age of big data and artificial intelligence, each health care organization has built its own data infrastructure to support its own needs, typically involving on-premises computing and storage. Data is balkanized along organizational boundaries, severely constraining the ability to provide services to patients across a care continuum within one organization or across organizations.”

Focus on outcomes, and be smart

What can be done to bridge these gaps? Technology innovators and IT specialists in the health care sector have taken up the challenge, and important advances are underway. Along the way, several key best practices have emerged, including:

  • Ensure appropriateness via connectivity for patient pathways and a smart imaging value chain: Historically, digital imaging and other data have been siloed within a particular department, such as radiology, cardiology, orthopedics, or oncology. The future will see a patient’s data follow them across all health care encounters and all specialties using an open patient data model. This includes new digital transformations in clinical departments such as pathology.
  • Manage the data load to support diagnostic outcomes: Too often, clinicians must sift through reams of irrelevant data to find the key information they need to reach actionable decisions. Decision support software is an effective tool for identifying and highlighting important diagnostic findings, delivering the data doctors need at the touch of a button. Integrated, state-of-the-art AI technologies have proven to be key enablers of workflow automation and clinical efficiency improvements.
  • Secure access and connectivity to different interoperability standards: To address the continuing challenge of interoperability and ensure that all parts of the system use the same syntax and speak the same language, central core software modules will be used to translate data coming in from a variety of sources, including third-party vendor offerings (see the sketch after this list for a simplified illustration). These software modules will become more flexible, connect more medical specialties, and incorporate more data and functionality. Standardization improves security by providing a common technological base, and it allows continuous improvements and updates in all areas within the shortest time possible.
  • Connect care systems and drive collaborative outcomes: The goal is to provide, in one comprehensive interface, the data physicians need, the tools to analyze that data, and the means to reach an actionable decision. This principle makes everyone from all specialties an equal participant in patient care. All members of the care team can access the same interface and participate in the delivery of care, no matter where the data and devices are accessed. Remote solutions will be an important building block.
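
As a highly simplified illustration of the kind of translation such core modules perform, the following Python sketch reads patient and study metadata from a DICOM file with the pydicom library and maps it to a small, FHIR-style imaging-study record. The field mapping, the to_fhir_imaging_study function, the record layout, and the file name are illustrative assumptions, not part of any specific product.

# Minimal sketch: map DICOM study metadata to a FHIR-style ImagingStudy record.
# Assumes the pydicom library; the mapping and record layout are illustrative only.
from typing import Any, Dict
import pydicom

def to_fhir_imaging_study(dicom_path: str) -> Dict[str, Any]:
    """Read one DICOM file and return a simplified, FHIR-like ImagingStudy dict."""
    # Read header metadata only; pixel data is not needed for translation.
    ds = pydicom.dcmread(dicom_path, stop_before_pixels=True)
    return {
        "resourceType": "ImagingStudy",
        "identifier": [{"system": "urn:dicom:uid",
                        "value": str(ds.get("StudyInstanceUID", ""))}],
        "started": str(ds.get("StudyDate", "")),       # DICOM YYYYMMDD; real code would reformat
        "modality": [{"code": str(ds.get("Modality", ""))}],
        "description": str(ds.get("StudyDescription", "")),
        "subject": {                                    # patient demographics travel with the study
            "reference": f"Patient/{ds.get('PatientID', '')}",
            "display": str(ds.get("PatientName", "")),
        },
    }

if __name__ == "__main__":
    # Hypothetical file path, for illustration only.
    record = to_fhir_imaging_study("example_study.dcm")
    print(record)

In practice, such translation layers would also handle HL7 and FHIR messaging, vendor-neutral archives, and non-imaging data, but the principle is the same: a common, standardized record that any connected specialty can read.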

What’s next?

We all feel the need for change in health care. It is not a question of whether something should change. It is about how we shape this process. Software, as in many other industries, is the central building block on which health care will be based in the 21st century. The strategy of the health care provider is key to getting health care services on track.

Learn how Syngo Carbon and Siemens Healthineers built an outcome-driven imaging system (ODIS) to help achieve this change.

This content was produced by Siemens Healthineers. It was not written by MIT Technology Review’s editorial team.
