FEATURE ARTICLE | JUL-AUG 2016 ISSUE

Viewpoints: Can Big Data Elevate the Standard of Care in Neurology and Neurosurgery?

In the fields of neurology and neurosurgery, the growing array of assessment and treatment options can complicate the process of defining a care plan that adequately meets the specific needs of individual patients. A key barrier to assessing how a multitude of factors such as demographics, disease diagnosis, prior treatment history, and adverse events impact treatment outcomes is a lack of large data sets that can provide appropriate statistical power—typically on the order of tens to hundreds of thousands of patients. Until recently, it was not feasible to conduct such large and robust analyses. The rise of big data analytics, however, harnessing sophisticated algorithms and tremendous computing power, now allows researchers and clinicians to answer critical questions that have the potential to drive optimum outcomes for every patient.

Big Data Is Altering the Healthcare Landscape

The accumulation and use of large data sets began changing the healthcare landscape long before the phrase “big data” was coined. For example, the seminal Framingham Heart Study, initiated in 1948, leveraged data from more than 5,000 individuals to provide critical insights into the epidemiology of cardiovascular disease. Now in its third generation of participants, this study demonstrated the power of large data sets to inform clinical practice and is still shaping how heart health is managed in the 21st century.

More recently, an analysis of hundreds of thousands of gene expression profiles allowed researchers at Stanford University to determine that a seldom-used drug previously approved to treat depression (imipramine) is effective in treating small cell lung cancer.1 Identifying a potential new use for an existing medication allowed for the rapid initiation of clinical trials, saving several years and millions of dollars in development costs compared with creating a new lung cancer therapy from scratch.

Harnessing and efficiently analyzing large sets of data are essential for bringing precision medicine to the next level. The American Association for Cancer Research’s Project GENIE (Genomics Evidence Neoplasia Information Exchange) provides statistical power to support clinical decision making for rare cancers and atypical variants of common cancers. The GENIE registry aggregates clinical data and outcomes from tens of thousands of cancer patients treated at institutions around the world, and is expected to enable biomarker validation, the identification of new drug targets, and the repurposing of existing medications for new cancer indications, as well as to provide evidence supporting reimbursement for next-generation diagnostic tests and therapies.

It should be noted that collecting and sharing detailed patient data requires the implementation of robust controls to protect confidentiality. Such protections must ensure that individual patient identities are not disclosed and that the data are only available to authorized users who agree to adhere to appropriate security and privacy protocols.

Applications of Big Data in Neurology

As a field that uses cutting-edge imaging and functional assessments to improve patient care, clinical neuroscience has much to gain from big data analyses. Prospective studies have successfully gathered treatment data in indications such as lumbar spine surgery, spinal cord injury, traumatic brain injury, and stereotactic radiosurgery (SRS). While these studies demonstrate that insights from prospectively collected treatment data can guide patient care in multiple areas of neurology and neurosurgery, few of the data sets used in these studies include more than 1,000 patient records, limiting their impact. However, clinical registries are now far exceeding these numbers.

Realizing the full potential of big data in neurology will require data sets of 10,000 to 100,000 patients to provide the statistical power needed to advance patient care and improve survival and quality of life during and after treatment. The ability to generate data that support the safety and efficacy of particular treatment pathways is also important for meeting reimbursement requirements under the Affordable Care Act.

While generating data sets of this size may seem daunting, recent advances in cloud computing, computational algorithms, and data encryption are enabling the development of large patient registries. Ideally, such registries will enable data capture and analysis at the local institutional level while also supporting data sharing and analysis on a global scale.

As a clinician and researcher, I worked with colleagues at our institution and other sites to launch a prospective data collection system, the Leksell Gamma Knife Registry (http://static.elektra.com/lgkregistry/), designed to facilitate the capture and analysis of data across all aspects of SRS used in the clinical neurology setting.2 Our goal was to create new data sets, specific to brain radiosurgery, that are updated continuously and securely over the web, allowing clinicians to run queries in real time and benchmark their own data against national or global aggregates. Importantly, underlying these high-level views is a rich database with detailed treatment and follow-up data, which is ideally suited for research. With this information at our fingertips, we may be able to effect real change in medicine. Although the specifics of this system are focused on SRS, many of its key features are relevant to other areas of neurology and neurosurgery. As such, it may provide a starting point for the development of registries focused on a variety of additional neurology indications.
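
To make the benchmarking idea concrete, the following is a minimal sketch in Python (using pandas) of the kind of query such a registry can answer. The field names and example figures are hypothetical and do not come from the registry itself.

import pandas as pd

# Hypothetical registry extract: one row per treated patient.
# The field names ("site", "diagnosis", "hearing_preserved") are illustrative only.
records = pd.DataFrame({
    "site": ["our_center", "our_center", "site_b", "site_c", "site_c"],
    "diagnosis": ["vestibular_schwannoma"] * 5,
    "hearing_preserved": [True, False, True, True, False],
})

# Registry-wide (global) benchmark for one diagnosis.
is_schwannoma = records["diagnosis"] == "vestibular_schwannoma"
global_rate = records.loc[is_schwannoma, "hearing_preserved"].mean()

# The same measure restricted to our own institution.
local_rate = records.loc[
    is_schwannoma & (records["site"] == "our_center"), "hearing_preserved"
].mean()

print(f"Local hearing preservation rate: {local_rate:.0%}")
print(f"Registry-wide benchmark: {global_rate:.0%}")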

For a data collection system to be successful, it must meet several criteria. Unlike a standard electronic medical record (EMR), the system must support efficient mining of text, imaging data, and the results of other neurological assessments. Additionally, the system must accommodate the needs and constraints of today’s busy healthcare settings, which means minimal record entry time and compatibility with routine clinical workflow. In our current SRS registry project, it takes approximately six minutes to enter a new record and 45 seconds to update an existing one. The system should also support analyses based on disease categories or subtypes, a variety of demographic criteria, and treatment-specific parameters. The SRS system also includes pre-designed dashboards that provide visual data summaries for common queries (e.g., schwannoma, hearing scores, and brain metastases). Similar templates could be developed in registries for other indications to simplify the user experience.
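
As a purely illustrative sketch of what a compact, structured record might look like, so that entry stays fast and the fields remain mineable, consider the short Python example below. The field names, categories, and values are hypothetical rather than those of the actual SRS registry.

from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class RegistryRecord:
    """A deliberately small record built from coded fields rather than
    free text, so that entry is quick and later mining is straightforward."""
    patient_id: str                          # de-identified registry identifier
    diagnosis: str                           # coded disease category, e.g., "brain_metastases"
    subtype: Optional[str] = None            # optional finer classification
    age_at_treatment: Optional[int] = None   # demographic criterion
    treatment_date: Optional[date] = None
    margin_dose_gy: Optional[float] = None   # treatment-specific parameter
    follow_up_months: Optional[float] = None

# Creating a new record is a single call; updating an existing record
# touches only the fields that changed.
rec = RegistryRecord(patient_id="R-0001", diagnosis="brain_metastases",
                     age_at_treatment=62, margin_dose_gy=18.0)
rec.follow_up_months = 6.0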

Another critical aspect of any successful patient registry is ensuring that data entry fields are aligned with standard nomenclature and that all users adhere to the same data entry criteria (the “data dictionary”). Similarly, a key lesson from the initial testing of the SRS registry is the need for ongoing quality control and data auditing processes. This aspect of data management becomes more critical as registries accept data from a growing number of clinical sites and investigators. Expanding participation in global registries requires standard technology platforms that are readily available to collaborators at healthcare facilities in both developed and resource-constrained countries.
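
One way to hold every site to the same data dictionary, and to support ongoing auditing, is to validate each submission against a controlled vocabulary before it is accepted. The sketch below is a hypothetical Python illustration of that idea, not the registry’s actual implementation.

# Hypothetical data dictionary: the allowed values for each coded field.
DATA_DICTIONARY = {
    "diagnosis": {"vestibular_schwannoma", "brain_metastases", "meningioma"},
    "hearing_score": {"I", "II", "III", "IV", "V"},  # e.g., Gardner-Robertson class
}

def audit_record(record: dict) -> list:
    """Return a list of human-readable problems found in one submitted record."""
    problems = []
    for field_name, allowed in DATA_DICTIONARY.items():
        value = record.get(field_name)
        if value is None:
            problems.append(f"missing required field '{field_name}'")
        elif value not in allowed:
            problems.append(f"'{value}' is not a permitted entry for '{field_name}'")
    return problems

# A record using a non-standard term is flagged for review during a data audit.
print(audit_record({"diagnosis": "acoustic neuroma", "hearing_score": "II"}))
# -> ["'acoustic neuroma' is not a permitted entry for 'diagnosis'"]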

Developing global registries also necessitates addressing cross-country differences in regulations related to the acquisition, use, and sharing of patient data. While these differences are not trivial in many cases, the benefits to be gained by increasing participation in such registries demand that all interested parties work together to find common ground. Similarly, the platform needs to be flexible enough to support multiple objectives, including clinical research, quality improvement initiatives, and cost-effectiveness analyses.

Conclusion

The ongoing efforts to leverage big data to optimize the use of SRS are important steps toward improving patient outcomes in several important disease areas. However, the application of big data approaches to addressing key questions in clinical neurology is just beginning. Developing additional registries to explore other neurological indications and interventions is essential as we continue to strive to ensure that every one of our patients receives optimal and informed care.

Douglas Kondziolka, MD is a Professor of Neurosurgery and Radiation Oncology at New York University Langone Medical Center, where he also serves as the director of the Center for Advanced Radiosurgery. He launched the first global registry platform for stereotactic radiosurgery.

1. Jahchan NS, et al. A drug repositioning approach identifies tricyclic antidepressants as inhibitors of small cell lung cancer and other neuroendocrine tumors. Cancer Discov. 2013;3(12):1364-77.

2. Kondziolka D, et al. Development, implementation, and use of a local and global clinical registry for neurosurgery. Big Data. 2015;3(2):80-89.
