Multimodal Monitoring and Neurocritical Care Bioinformatics: The Dawn of a New Age

December 23, 2008

By J. Claude Hemphill III, MD, MAS, University of California, San Francisco, California, and Michael De Georgia, MD, Case Western Reserve University, Cleveland, Ohio

Neurocritical Care is a growth industry. New approaches to treating stroke, head and spinal cord trauma, global cerebral ischemia after cardiac arrest, and status epilepticus have brought neurologists back to the bedside of critically ill patients, engaged enthusiastic trainees, and raised public expectations about advances in patient care. Central to this growth has been the development of new monitoring technology. FDA-approved devices now exist for monitoring cerebral blood flow, brain tissue oxygen tension, relative tissue oxygen saturation, brain temperature, and brain metabolism. This is on top of "old" and familiar monitors for intracranial pressure and electroencephalography. Because each of these new technologies offers a slightly different perspective on brain physiology, there is a growing belief that combining these monitors (so-called "multimodal monitoring") may produce a more accurate overall picture. Multimodal monitoring must then be integrated with the systemic monitoring that is commonly performed in the intensive care unit (ICU). Before we get to that, though, it is useful to consider the purpose of monitoring in neurocritical care.

What is the purpose of monitoring the brain in neurocritical care?

The main goal of neurocritical care is the prevention, identification, and treatment of secondary brain injury. Hypotension, hypoxia, fever, seizures, and hyper- and hypoglycemia all exacerbate primary brain injury and worsen outcome. Neuromonitoring should provide a way to detect the impact of these and other secondary insults before they lead to permanent additional injury. In neurocritical care patients, the traditional neurologic examination is often compromised by the very nature of the primary event. It is of limited utility in the ongoing assessment of the impact of secondary insults. This increases the importance of using alternative monitoring methods, such as those described above. But we can't know what the monitors are telling us if we don't know how to use the data they provide.

Aren't we doing a good job already?

The honest answer is: we don't know. The ICU is a complex, data-intense environment (see figure). Physiologic data are acquired, continuously or intermittently, using devices from a variety of manufacturers. Dozens of systemic parameters are often monitored, including hemodynamic variables such as blood pressure and heart rate, as well as respiratory rate and pulse oximetry. Neuromonitoring is superimposed on this systemic monitoring. And while the number of monitors has grown exponentially since the origins of critical care almost fifty years ago, the reality is that we look at the data essentially the same way.


Neurocritical care bed at San Francisco
General Hospital (courtesy of Geoff Manley, MD, PhD)

Most ICUs still use paper charts with information hand-written by bedside nurses; current electronic charting systems generally just recapitulate this record rather than provide additional analysis. The truth is that when we stand at the bedside of a critically ill patient in 2008, it feels like we're looking at a giant Excel spreadsheet of raw data and using a divining rod to help point the direction.

Dr. William Hanson, Director of the Surgical Intensive Care Unit at the University of Pennsylvania, likens this approach to the tickertape strategy of the early stock market: "The way we chart information at the ICU bedside is archaic, abstruse, and illogical. Multiple streams of continuous, electronic critical care data are sampled on some regular or irregular basis and laboriously transcribed onto a critical care record in an arbitrary arrangement, one that varies substantially from ICU to ICU. It is not much of a stretch to compare a 1960s-era stockbroker poring over a length of tickertape with today's intensivist leafing through a ream of ICU records."1

Where do we go from here?

Before we can incorporate our neuromonitoring tools, much work needs to be done on the basic ICU information technology infrastructure.

  • First, the physiologic data needs to be integrated and time-synchronized into one secure dataset. Proprietary limitations from industry and other technical challenges currently make this difficult.
  • Second, processing or analysis of the data needs to be performed. Most bedside monitors numerically display three- to five-second time-averaged data, but even basic statistical analyses (mean, median, standard deviations) are nearly impossible to perform on the fly.
  • Moreover, conceptually, data analysis has been restricted to univariate models focusing on relatively simplistic identification of events, typically when predetermined thresholds are crossed (for example, when the systolic blood pressure falls below 90 mm Hg or the intracranial pressure rises over 20 mm Hg).
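The univariate, threshold-based approach described above is simple enough to sketch. The following Python snippet is an illustration only: the signal names, values, window size, and thresholds are hypothetical, not clinical data. It shows fixed-threshold event detection (SBP below 90 mm Hg, ICP above 20 mm Hg) alongside the kind of basic rolling statistics that, as noted, are nearly impossible to compute on the fly at most bedsides:

```python
from statistics import mean, stdev

# Hypothetical one-sample-per-interval streams (all names and values illustrative).
sbp = [112, 108, 95, 88, 86, 92, 101, 110]   # systolic blood pressure, mm Hg
icp = [14, 15, 17, 21, 24, 22, 18, 16]       # intracranial pressure, mm Hg

def rolling_stats(signal, window):
    """Mean and standard deviation over a sliding window --
    the basic on-the-fly summaries most bedside monitors lack."""
    out = []
    for i in range(window - 1, len(signal)):
        chunk = signal[i - window + 1 : i + 1]
        out.append((mean(chunk), stdev(chunk)))
    return out

def threshold_events(signal, predicate):
    """Univariate event detection: sample indices where a fixed threshold is crossed."""
    return [i for i, v in enumerate(signal) if predicate(v)]

print(rolling_stats(sbp, 4))
print(threshold_events(sbp, lambda v: v < 90))   # hypotension episodes
print(threshold_events(icp, lambda v: v > 20))   # intracranial hypertension episodes
```

The limitation the text describes is visible here: each signal is interrogated in isolation against its own fixed cutoff, so the joint pattern (in this sketch, ICP rising as SBP falls) is never examined as a single multivariate event.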

Thus, we continue to treat in a univariate manner, while our patients undoubtedly exist in a multivariate world. We believe there may be untapped and clinically valuable information buried in commonly acquired physiologic signals that could provide insight into the overall physiologic state of the patient. Our inability to integrate, clean, store, analyze, and visualize the enormous amount of bedside physiological data has made it almost impossible to extract this information. We're left in the situation where we don't even know what we don't know.

Managing ICU complexity

This is not the case in other areas of medicine and in other industries. Advances in the field of complex systems have yielded an array of new analytical techniques. For example, looking at the "area-under-the-curve" is now standard in pharmacokinetics, multivariable regression analysis is standard in many areas of research, and advanced statistical analysis has been a major factor in human genome sequencing efforts. The term "bioinformatics" has emerged to encompass these efforts. Medical "informatics," on the other hand, has generally been understood only in the context of computerized medical records, billing databases, and compliance checklists for regulatory review (such as by the Joint Commission). Applying these advanced analytical techniques in the ICU, a "critical care bioinformatics," could identify early warning signals hidden in physiologic patterns.

Early attempts have been made to bring order to this informatics chaos. Several data acquisition systems have been developed and used in initial studies demonstrating the added value of assessing cerebral autoregulation,2 of quantifying area-under-the-curve as a "dose" of secondary brain insults, and of integrating multiple continuous parameters simultaneously3, but more work is needed.
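The "area-under-the-curve" notion of a secondary-insult dose can be made concrete with a short sketch. In this hypothetical example (the sampling interval, threshold, and ICP values are all illustrative assumptions, not data from the studies cited), the cumulative burden of intracranial hypertension is computed as the trapezoidal area of ICP above a 20 mm Hg threshold, yielding a single number with units of mm Hg · min rather than a simple count of threshold crossings:

```python
# Hypothetical ICP samples (mm Hg) at a fixed interval; all values illustrative.
icp = [18, 22, 27, 25, 21, 19]
DT_MIN = 1.0          # sampling interval, minutes (assumed)
THRESHOLD = 20.0      # intracranial hypertension threshold, mm Hg

def insult_dose(samples, threshold, dt):
    """Area under the curve above threshold (mm Hg * min),
    by trapezoidal integration of the suprathreshold excess."""
    excess = [max(v - threshold, 0.0) for v in samples]
    return sum((a + b) / 2.0 * dt for a, b in zip(excess, excess[1:]))

print(insult_dose(icp, THRESHOLD, DT_MIN))
```

Unlike a binary alarm, this dose distinguishes a brief, shallow excursion above threshold from a prolonged, deep one, which is the intuition behind treating secondary insults as a cumulative exposure.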

Intensive care medicine today has become "the art of managing extreme complexity."4 As neurointensivists, we use both clinical expertise and new multimodal monitoring technology in our attempt to achieve this art. In the future, neuromonitors will need to be integrated with one another and with systemic monitors. For years we have been limited by insufficient computational power, a lack of specialized software, and incompatibility between monitoring equipment and systems for data collection and analysis. We will need standard systems for data acquisition, integration, real-time analysis, and clear, user-friendly visualization.

Coordination = Benefits

This is a job beyond clinicians alone. A coordinated effort involving clinicians, engineers, computer scientists, experts in informatics and complex biostatistics, and industry will likely be required to truly move this field of "critical care bioinformatics" forward. The potential payoffs are huge: better insight into complex physiology, early detection of secondary insults, reduction in medical errors, improved efficiency, and most importantly, better patient outcomes. We believe that this approach could fundamentally change the way medicine is practiced.

A new day is dawning for neurocritical care. But the road is long and the challenges immense. Recognition is the first step. Now the real work begins.

References

1. Hanson CW. Ticker tape medicine. Crit Care Med 2004;32:3551-3552.

2. Smielewski P, Czosnyka M, Steiner L, Belestri M, Piechnik S, Pickard JD. ICM+: software for on-line analysis of bedside monitoring data after severe head trauma. Acta Neurochir Suppl 2005;95:43-49.

3. Sorani MD, Hemphill JC, 3rd, Morabito D, Rosenthal G, Manley GT. New approaches to physiological informatics in neurocritical care. Neurocrit Care 2007;7:45-52.

4. Gawande A. The Checklist. The New Yorker, December 10, 2007.

Author Disclosures

Dr. Hemphill has received personal compensation from Novo Nordisk, UCB Pharma, Innercool Therapies, Medivance, and Ornim within the last 24 months. In the same period, he served on the editorial board of Neurocritical Care, as well as serving as expert witness consultant regarding neurocritical care and stroke. Dr. Hemphill has received research support from NIH/NINDS, Novo Nordisk, and Maxygen within the last five years.

Dr. De Georgia has served as a paid member of the International Advisory Board of Orsan Medical Technologies Ltd., Netanya, Israel.