Artificial intelligence is poised to become a transformative force in healthcare. How will providers and patients benefit from the impact of AI-driven tools?
The healthcare industry is ripe for some major changes. From chronic diseases and cancer to radiology and risk assessment, there are nearly endless opportunities to leverage technology to deploy more precise, efficient, and impactful interventions at exactly the right moment in a patient’s care.
As payment structures evolve, patients demand more from their providers, and the volume of available data continues to increase at a staggering rate, artificial intelligence is poised to be the engine that drives improvements across the care continuum.
AI offers a number of advantages over traditional analytics and clinical decision-making techniques. Learning algorithms can become more precise and accurate as they interact with training data, allowing humans to gain unprecedented insights into diagnostics, care processes, treatment variability, and patient outcomes.
At the 2018 World Medical Innovation Forum (WMIF) on artificial intelligence, presented by Partners Healthcare, leading researchers and clinical faculty members showcased the twelve technologies and areas of the healthcare industry that are most likely to see a major impact from artificial intelligence within the next decade.
Every member of this “Disruptive Dozen” has the potential to deliver significant benefits to patients along with broad commercial success, said WMIF co-chairs Anne Klibanski, MD, Chief Academic Officer at Partners Healthcare, and Gregg Meyer, MD, Chief Clinical Officer.
With the help of experts from across the Partners Healthcare system, including faculty from Harvard Medical School (HMS), moderators Keith Dreyer, DO, PhD, Chief Data Science Officer at Partners, and Katherine Andriole, PhD, Director of Research Strategy and Operations at Massachusetts General Hospital (MGH), counted down the top 12 ways artificial intelligence will revolutionize the delivery and science of healthcare.
UNIFYING MIND AND MACHINE THROUGH BRAIN-COMPUTER INTERFACES
Using computers to communicate is not a new idea by any means, but creating direct interfaces between technology and the human mind without the need for keyboards, mice, and monitors is a cutting-edge area of research that has significant applications for some patients.
Neurological diseases and trauma to the nervous system can take away some patients’ abilities to speak, move, and interact meaningfully with people and their environments. Brain-computer interfaces (BCIs) backed by artificial intelligence could restore those fundamental experiences to those who feared them lost forever.
“If I’m in the neurology ICU on a Monday, and I see someone who has suddenly lost the ability to move or to speak, we want to restore that ability to communicate by Tuesday,” said Leigh Hochberg, MD, PhD, Director of the Center for Neurotechnology and Neurorecovery at MGH.
“By using a BCI and artificial intelligence, we can decode the neural activity associated with the intended movement of one’s hand, and we should be able to allow that person to communicate the same way as many people in this room have communicated at least five times over the course of the morning using a ubiquitous communication technology like a tablet computer or phone.”
Brain-computer interfaces could drastically improve quality of life for patients with ALS, strokes, or locked-in syndrome, as well as the 500,000 people worldwide who experience spinal cord injuries every year.
DEVELOPING THE NEXT GENERATION OF RADIOLOGY TOOLS
Radiological images obtained by MRI machines, CT scanners, and x-rays offer non-invasive visibility into the inner workings of the human body. But many diagnostic processes still rely on physical tissue samples obtained through biopsies, which carry risks including the potential for infection.
Artificial intelligence will enable the next generation of radiology tools that are accurate and detailed enough to replace the need for tissue samples in some cases, experts predict.
“We want to bring together the diagnostic imaging team with the surgeon or interventional radiologist and the pathologist,” said Alexandra Golby, MD, Director of Image-Guided Neurosurgery at Brigham & Women’s Hospital (BWH). “That coming together of different teams and aligning goals is a big challenge.”
“If we want the imaging to give us information that we presently get from tissue samples, then we’re going to have to be able to achieve very close registration so that the ground truth for any given pixel is known.”
Succeeding in this quest may allow clinicians to develop a more accurate understanding of how tumors behave as a whole instead of basing treatment decisions on the properties of a small segment of the malignancy.
Providers may also be able to better define the aggressiveness of cancers and target treatments more appropriately.
Artificial intelligence is helping to enable “virtual biopsies” and advance the innovative field of radiomics, which focuses on harnessing image-based algorithms to characterize the phenotypes and genetic properties of tumors.
EXPANDING ACCESS TO CARE IN UNDERSERVED OR DEVELOPING REGIONS
Shortages of trained healthcare providers, including ultrasound technicians and radiologists, can significantly limit access to life-saving care in developing nations around the world.
More radiologists work in the half-dozen hospitals lining the renowned Longwood Avenue in Boston than in all of West Africa, the session pointed out.
Artificial intelligence could help mitigate the impacts of this severe deficit of qualified clinical staff by taking over some of the diagnostic duties typically allocated to humans.
For example, AI imaging tools can screen chest x-rays for signs of tuberculosis, often achieving a level of accuracy comparable to humans. This capability could be deployed through an app available to providers in low-resource areas, reducing the need for a trained diagnostic radiologist on site.
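The screening pattern described above can be sketched in miniature. The code below is purely illustrative, not the actual TB tool: the "image features" are synthetic numbers standing in for what an upstream imaging model would extract from a chest x-ray, and the threshold is an assumption.

```python
# Minimal sketch of AI-assisted screening: a classifier trained on labeled
# feature vectors flags likely-positive scans for expert review.
# All data here is synthetic; this is illustration, not a clinical tool.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic "image features": positive cases are shifted relative to negatives.
X_neg = rng.normal(0.0, 1.0, size=(200, 16))
X_pos = rng.normal(1.0, 1.0, size=(200, 16))
X = np.vstack([X_neg, X_pos])
y = np.array([0] * 200 + [1] * 200)

model = LogisticRegression(max_iter=1000).fit(X, y)

def screen(features, threshold=0.5):
    """Return True when the scan should be routed to a radiologist."""
    prob_positive = model.predict_proba(features.reshape(1, -1))[0, 1]
    return prob_positive >= threshold

# A clearly positive-looking synthetic scan gets flagged for review.
print(screen(np.full(16, 1.0)))
```

In a deployed app, the scoring step would run on-device or in the cloud, with only flagged cases escalated to a remote radiologist.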
“The potential for this tech to increase access to healthcare is tremendous,” said Jayashree Kalpathy-Cramer, PhD, Assistant in Neuroscience at MGH and Associate Professor of Radiology at HMS.
However, algorithm developers must be careful to account for the fact that disparate ethnic groups or residents of different regions may have unique physiologies and environmental factors that will influence the presentation of disease.
“The course of a disease and population affected by the disease may look very different in India than in the US, for example,” she said.
“As we’re developing these algorithms, it’s very important to make sure that the data represents a diversity of disease presentations and populations – we can’t just develop an algorithm based on a single population and expect it to work as well on others.”
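Kalpathy-Cramer's warning can be made concrete with a small experiment: a model trained on one population can look accurate on data like its training set yet perform no better than chance on a population whose disease presentation is shifted. The populations, features, and "disease" below are entirely synthetic.

```python
# Sketch of why single-population training fails: evaluate the model
# separately on each population it will serve, not with one pooled number.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)

def make_population(shift, n=300):
    """Synthetic cohort whose disease presentation is offset by `shift`."""
    X = rng.normal(shift, 1.0, size=(n, 8))
    y = (X.sum(axis=1) + rng.normal(0, 1, n) > shift * 8).astype(int)
    return X, y

X_a, y_a = make_population(shift=0.0)  # population the model was trained on
X_b, y_b = make_population(shift=2.0)  # population with shifted presentation

model = LogisticRegression(max_iter=1000).fit(X_a, y_a)
acc_a = model.score(X_a, y_a)
acc_b = model.score(X_b, y_b)

print(f"accuracy on training-like population: {acc_a:.2f}")
print(f"accuracy on shifted population:       {acc_b:.2f}")
```

The second score collapses because the model learned a decision boundary calibrated to the first population's baseline, which is exactly the failure mode diverse training data is meant to prevent.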
REDUCING THE BURDENS OF ELECTRONIC HEALTH RECORD USE
EHRs have played an instrumental role in the healthcare industry’s journey towards digitalization, but the switch has brought myriad problems associated with cognitive overload, endless documentation, and user burnout.
EHR developers are now using artificial intelligence to create more intuitive interfaces and automate some of the routine processes that consume so much of a user’s time.
Users spend the majority of their time on three tasks: clinical documentation, order entry, and sorting through the in-basket, said Adam Landman, MD, Vice President and CIO at Brigham Health.
Voice recognition and dictation are helping to improve the clinical documentation process, but natural language processing (NLP) tools might not be going far enough.
“I think we may need to be even bolder and consider changes like video recording a clinical encounter, almost like police wear body cams,” said Landman. “And then you can use AI and machine learning to index those videos for future information retrieval.
“And just like in the home, where we’re using Siri and Alexa, the future will bring virtual assistants to the bedside for clinicians to use with embedded intelligence for order entry.”
Artificial intelligence may also help to process routine requests from the inbox, like medication refills and result notifications. It may also help to prioritize tasks that truly require the clinician’s attention, Landman added, making it easier for users to work through their to-do lists.
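The in-basket triage idea Landman describes can be sketched as a simple text classifier. Everything below is a toy: the messages are hand-written, the two categories are invented for illustration, and a real system would train on labeled EHR inbox data and route far more than two classes.

```python
# Illustrative sketch (not a real EHR integration): a tiny classifier that
# routes in-basket messages so routine refill requests can be handled
# automatically while clinical concerns reach the clinician first.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

messages = [
    "please refill my lisinopril prescription",
    "requesting a refill of metformin",
    "need my statin prescription renewed",
    "refill request for albuterol inhaler",
    "I have new chest pain and shortness of breath",
    "my incision looks infected and is draining",
    "severe headache since yesterday, getting worse",
    "worsening swelling in my left leg",
]
labels = ["routine"] * 4 + ["urgent"] * 4

triage = make_pipeline(TfidfVectorizer(), LogisticRegression())
triage.fit(messages, labels)

print(triage.predict(["requesting a refill of my lisinopril"]))
```

Messages scored "routine" could feed an automated refill workflow, while "urgent" ones jump to the top of the clinician's to-do list.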
CONTAINING THE RISKS OF ANTIBIOTIC RESISTANCE
Antibiotic resistance is a growing threat to populations around the world as overuse of these critical drugs fosters the evolution of superbugs that no longer respond to treatments. Multi-drug resistant organisms can wreak havoc in the hospital setting and claim thousands of lives every year.
C. difficile alone accounts for approximately $5 billion in annual costs for the US healthcare system and claims more than 30,000 lives.
Electronic health record data can help to identify infection patterns and highlight patients at risk before they begin to show symptoms. Leveraging machine learning and AI tools to drive these analytics can enhance their accuracy and create faster, more accurate alerts for healthcare providers.
“AI tools can live up to the expectation for infection control and antibiotic resistance,” said Erica Shenoy, MD, PhD, Associate Chief of the Infection Control Unit at MGH.
“If they don’t, then that’s really a failure on all of our parts. For the hospitals sitting on mountains of EHR data and not using them to the fullest potential, to industry that’s not creating smarter, faster clinical trial design, and for EHRs that are creating these data not to use them…that would be a failure.”
CREATING MORE PRECISE ANALYTICS FOR PATHOLOGY IMAGES
Pathologists provide one of the most significant sources of diagnostic data for providers across the spectrum of care delivery, says Jeffrey Golden, MD, Chair of the Department of Pathology at BWH and a professor of pathology at HMS.
“Seventy percent of all decisions in healthcare are based on a pathology result,” he said. “Somewhere between 70 and 75 percent of all the data in an EHR are from a pathology result. So the more accurate we get, and the sooner we get to the right diagnosis, the better we’re going to be. That’s what digital pathology and AI have the opportunity to deliver.”
Analytics that can drill down to the pixel level on extremely large digital images can allow providers to identify nuances that may escape the human eye.
“We’re now getting to the point where we can do a better job of assessing whether a cancer is going to progress rapidly or slowly and how that might change how patients will be treated based on an algorithm rather than clinical staging or the histopathologic grade,” said Golden. “That’s going to be a huge advance.”
Artificial intelligence can also improve productivity by identifying features of interest in slides before a human clinician reviews the data, he added.
“AI can screen through slides and direct us to the right thing to look at so we can assess what’s important and what’s not. That increases the efficiency of the use of the pathologist and increases the value of the time they spend for each case.”
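The screen-and-direct workflow Golden describes can be sketched as ranking slide regions by model score. The scoring model here is a stand-in (random scores on a grid of tiles); in a real digital pathology pipeline, a trained classifier would assign each tile a tumor probability.

```python
# Sketch of slide pre-screening: score every tile of a digitized slide and
# send the pathologist to the highest-scoring regions first.
import numpy as np

rng = np.random.default_rng(5)

# A whole-slide image divided into a 20 x 20 grid of tiles, each with a
# (here randomly generated) model-assigned probability of containing tumor.
tile_scores = rng.random((20, 20))

def regions_of_interest(scores, k=5):
    """Return the (row, col) coordinates of the k most suspicious tiles."""
    flat = np.argsort(scores, axis=None)[::-1][:k]
    return [tuple(np.unravel_index(i, scores.shape)) for i in flat]

for row, col in regions_of_interest(tile_scores):
    print(f"review tile ({row}, {col}): score {tile_scores[row, col]:.2f}")
```

The pathologist still makes the diagnosis; the ranking only changes where their time goes first, which is where the efficiency gain Golden describes comes from.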
BRINGING INTELLIGENCE TO MEDICAL DEVICES AND MACHINES
Smart devices are taking over the consumer environment, offering everything from real-time video from the inside of a refrigerator to cars that can detect when the driver is distracted.
In the medical environment, smart devices are critical for monitoring patients in the ICU and elsewhere. Using artificial intelligence to enhance the ability to identify deterioration, suggest that sepsis is taking hold, or sense the development of complications can significantly improve outcomes and may reduce costs related to hospital-acquired condition penalties.
“When we’re talking about collecting disparate data from across the healthcare system, integrating it, and generating an alert that would prompt an ICU doctor to intervene early on – the aggregation of that data is not something that a human can do very well,” said Mark Michalski, MD, Executive Director of the MGH & BWH Center for Clinical Data Science.
Inserting intelligent algorithms into these devices can reduce cognitive burdens for physicians while ensuring that patients receive care in as timely a manner as possible.
ADVANCING THE USE OF IMMUNOTHERAPY FOR CANCER TREATMENT
Immunotherapy is one of the most promising avenues for treating cancer. By using the body’s own immune system to attack malignancies, patients may be able to beat stubborn tumors. However, only a small number of patients respond to current immunotherapy options, and oncologists still do not have a precise and reliable method for identifying which patients will benefit from this option.
Machine learning algorithms and their ability to synthesize highly complex datasets may be able to illuminate new options for targeting therapies to an individual’s unique genetic makeup.
“Recently, the most exciting development has been checkpoint inhibitors, which block some of the proteins made by some types of immune cells,” explained Long Le, MD, PhD, Director of Computational Pathology and Technology Development at the MGH Center for Integrated Diagnostics. “But we still don’t understand all of the disease biology. This is a very complex problem.”
“We definitely need more patient data. The therapies are relatively new, so not a lot of patients have actually been put on these drugs. So whether we need to integrate data within one institution or across multiple institutions is going to be a key factor in terms of augmenting the patient population to drive the modeling process.”
TURNING THE ELECTRONIC HEALTH RECORD INTO A RELIABLE RISK PREDICTOR
EHRs are a goldmine of patient data, but extracting and analyzing that wealth of information in an accurate, timely, and reliable manner has been a continual challenge for providers and developers.
Data quality and integrity issues, plus a mishmash of data formats, structured and unstructured inputs, and incomplete records have made it very difficult to understand exactly how to engage in meaningful risk stratification, predictive analytics, and clinical decision support.
“Part of the hard work is integrating the data into one place,” observed Ziad Obermeyer, MD, Assistant Professor of Emergency Medicine at BWH and Assistant Professor at HMS. “But another problem is understanding what it is you’re getting when you’re predicting a disease in an EHR.”
“You might hear that an algorithm can predict depression or stroke, but when you scratch the surface, you find what they’re actually predicting is a billing code for stroke. That’s very different from stroke itself.”
Relying on MRI results might appear to offer a more concrete dataset, he continued.
“But now you have to think about who can afford the MRI, and who can’t? So what you end up predicting isn’t what you thought you were predicting. You might be predicting billing for a stroke in people who can pay for a diagnostic rather than some sort of cerebral ischemia.”
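Obermeyer's caveat can be made concrete with a few lines of synthetic data: the label you can extract from an EHR ("a stroke billing code exists") diverges from the label you care about ("the patient had a stroke") whenever billing depends on access to the diagnostic workup. The prevalence and access rates below are invented for illustration.

```python
# Sketch of the proxy-label problem: billing codes systematically miss
# events that never got worked up, so a model trained on them learns
# access to care as much as disease.
import numpy as np

rng = np.random.default_rng(3)
n = 10_000

had_stroke = rng.random(n) < 0.05        # the clinical event itself
can_afford_mri = rng.random(n) < 0.60    # access to the diagnostic workup

# A billing code appears only when the event occurred AND was worked up.
billed_for_stroke = had_stroke & can_afford_mri

missed = int(np.sum(had_stroke & ~billed_for_stroke))
print(f"strokes invisible to a billing-code label: {missed} of {had_stroke.sum()}")
```

The missed cases are not random: they concentrate in patients without access, which is exactly the hidden bias Obermeyer warns an algorithm can confirm.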
EHR analytics have produced many successful risk scoring and stratification tools, especially when researchers employ deep learning techniques to identify novel connections between seemingly unrelated datasets.
But ensuring that those algorithms do not confirm hidden biases in the data is crucial for deploying tools that will truly improve clinical care, Obermeyer maintained.
“The biggest challenge will be making sure exactly what we’re predicting even before we start opening up the black box and looking at how we’re predicting it,” he said.
MONITORING HEALTH THROUGH WEARABLES AND PERSONAL DEVICES
Almost all consumers now have access to devices with sensors that can collect valuable data about their health. From smartphones with step trackers to wearables that can track a heartbeat around the clock, a growing proportion of health-related data is generated on the go.
Collecting and analyzing this data – and supplementing it with patient-provided information through apps and other home monitoring devices – can offer a unique perspective into individual and population health.
Artificial intelligence will play a significant role in extracting actionable insights from this large and varied treasure trove of data.
But helping patients get comfortable with sharing data from this intimate, continual monitoring may require a little extra work, says Omar Arnaout, MD, Co-director of the Computational Neuroscience Outcomes Center and an attending neurosurgeon at BWH.
“As a society, we’ve been pretty liberal with our digital data,” he said. “But as things like Cambridge Analytica and Facebook come into our collective consciousness, people will become more and more prudent about who they share what kinds of data with.”
However, patients tend to trust their physicians more than they might trust a big company like Facebook, he added, which may help to ease any discomfort with contributing data to large-scale research initiatives.
“There’s a very good chance [wearable data will have a major impact] because our care is very episodic and the data we collect is very coarse,” said Arnaout. “By collecting granular data in a continuous fashion, there’s a greater likelihood that the data will help us take better care of patients.”
MAKING SMARTPHONE SELFIES INTO POWERFUL DIAGNOSTIC TOOLS
Continuing the theme of harnessing the power of portable devices, experts believe that images taken from smartphones and other consumer-grade sources will be an important supplement to clinical-quality imaging – especially in underserved populations or developing nations.
The quality of cell phone cameras is increasing every year, and can produce images that are viable for analysis by artificial intelligence algorithms. Dermatology and ophthalmology are early beneficiaries of this trend.
Researchers in the United Kingdom have even developed a tool that identifies developmental diseases by analyzing images of a child’s face. The algorithm can detect discrete features, such as a child’s jaw line, eye and nose placement, and other attributes that might indicate a craniofacial abnormality. Currently, the tool can match ordinary images to more than 90 disorders to provide clinical decision support.
“The majority of the population is equipped with pocket-sized, powerful devices that have a lot of different sensors built in,” said Hadi Shafiee, PhD, Director of the Laboratory of Micro/Nanomedicine and Digital Health at BWH.
“This is a great opportunity for us. Almost every major player in the industry has started to build AI software and hardware into their devices. That’s not a coincidence. Every day in our digital world, we generate more than 2.5 million terabytes of data. In cell phones, the manufacturers believe they can use that data with AI to provide much more personalized and faster and smarter services.”
Using smartphones to collect images of eyes, skin lesions, wounds, infections, medications, or other subjects may be able to help underserved areas cope with a shortage of specialists while reducing the time-to-diagnosis for certain complaints.
“There is something big happening,” said Shafiee. “We can leverage that opportunity to address some of the important problems we have in disease management at the point of care.”
REVOLUTIONIZING CLINICAL DECISION MAKING WITH ARTIFICIAL INTELLIGENCE AT THE BEDSIDE
As the healthcare industry shifts away from fee-for-service, so too is it moving further and further from reactive care. Getting ahead of chronic diseases, costly acute events, and sudden deterioration is the goal of every provider – and reimbursement structures are finally allowing them to develop the processes that will enable proactive, predictive interventions.
Artificial intelligence will provide much of the bedrock for that evolution by powering predictive analytics and clinical decision support tools that clue providers in to problems long before they might otherwise recognize the need to act.
AI can provide earlier warnings for conditions like seizures or sepsis, which often require intensive analysis of highly complex datasets.
Machine learning can also help support decisions around whether or not to continue care for critically ill patients, such as those who have entered a coma after cardiac arrest, says Brandon Westover, MD, PhD, Director of the MGH Clinical Data Animation Center.
Typically, providers must visually inspect EEG data from these patients, he explained. The process is time-consuming and subjective, and the results may vary with the skill and experience of the individual clinician.
“In these patients, trends might be slowly evolving,” he said. “Sometimes when we’re looking to see if someone is recovering, we take the data from ten seconds of monitoring at a time. But trying to see if it changed from ten seconds of data taken 24 hours ago is like trying to see whether your hair is growing longer.”
“But if you have an AI algorithm and lots and lots of data from many patients, it’s easier to match up what you’re seeing to long term patterns and maybe detect subtle improvements that would impact your decisions around care.”
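Westover's point can be sketched numerically: a slow recovery trend that is invisible when comparing two short snapshots becomes obvious when a model fits the whole day of monitoring. The signal below is synthetic (a small drift buried in noise of similar size), standing in for a summary measure derived from EEG.

```python
# Sketch of snapshot-versus-trend monitoring on a synthetic 24-hour record.
import numpy as np

rng = np.random.default_rng(4)

hours = np.arange(0, 24, 10 / 3600.0)                    # one sample per 10-s window
signal = 0.02 * hours + rng.normal(0, 0.5, hours.size)   # slow drift + noise

# Comparing one 10-second window now versus 24 hours ago: the ~0.5-unit
# drift is buried in noise of comparable size, so the change is unreliable.
snapshot_change = signal[-1] - signal[0]

# Fitting the whole record recovers the underlying trend reliably.
slope_per_hour = np.polyfit(hours, signal, 1)[0]

print(round(snapshot_change, 2), round(slope_per_hour, 3))
```

With many patients' records, the same idea extends to matching a new patient's trajectory against long-term recovery patterns, which is the comparison Westover describes.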
Leveraging AI for clinical decision support, risk scoring, and early alerting is one of the most promising areas of development for this revolutionary approach to data analysis.
By powering a new generation of tools and systems that make clinicians more aware of nuances, more efficient when delivering care, and more likely to get ahead of developing problems, AI will usher in a new era of clinical quality and exciting breakthroughs in patient care.