Artificial intelligence virtual consultant could guide interventional radiology patient care


Interventional radiologists at the University of California, Los Angeles (UCLA), USA, are using technology found in self-driving cars to power a machine learning application that helps guide patients’ interventional radiology care, according to research presented at the Society of Interventional Radiology’s 2017 annual scientific meeting (4–9 March, Washington, DC, USA).

The researchers used cutting-edge artificial intelligence to create a “chatbot” interventional radiologist that can automatically communicate with referring clinicians and quickly provide evidence-based answers to frequently asked questions. This allows the referring physician to give patients real-time information about the next phase of their treatment, or basic information about an interventional radiology procedure.

“We theorised that artificial intelligence could be used in a low-cost, automated way in interventional radiology as a way to improve patient care,” said Edward W Lee, assistant professor of Radiology at UCLA’s David Geffen School of Medicine and one of the authors of the study. “Because artificial intelligence has already begun transforming many industries, it has great potential to also transform healthcare.”

In this research, deep learning was used to understand a wide range of clinical questions and respond appropriately in a conversational manner similar to text messaging. Deep learning is a technology inspired by the workings of the human brain, in which networks of artificial neurons analyse large datasets to automatically discover patterns and “learn” without being explicitly programmed for each task. Deep learning networks can analyse complex datasets and provide rich insights in areas such as early detection, treatment planning, and disease monitoring.
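
As a rough illustration of this idea (not the UCLA team’s code), the toy sketch below stacks a few layers of artificial neurons in PyTorch to map a clinical question, represented as a simple word-count vector, onto one of several answer categories. The vocabulary size, layer sizes, and number of categories are assumptions chosen purely for the example.

```python
# Illustrative toy only; not the UCLA application's code.
# A minimal "deep" network: stacked layers of artificial neurons that
# learn to map a question (as a bag-of-words vector) to an answer category.
import torch
import torch.nn as nn

VOCAB_SIZE = 500       # assumed size of the question vocabulary
NUM_CATEGORIES = 10    # assumed number of answer categories

model = nn.Sequential(
    nn.Linear(VOCAB_SIZE, 64),     # first layer of artificial neurons
    nn.ReLU(),
    nn.Linear(64, 32),             # second, "deeper" layer
    nn.ReLU(),
    nn.Linear(32, NUM_CATEGORIES)  # one score per answer category
)

loss_fn = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

def train_step(question_vectors, category_labels):
    """One pass over a batch of (question, label) examples.
    The network adjusts its weights to reduce classification error,
    which is the 'learning' the article describes."""
    optimizer.zero_grad()
    scores = model(question_vectors)
    loss = loss_fn(scores, category_labels)
    loss.backward()
    optimizer.step()
    return loss.item()

# Example batch: 8 random question vectors with random labels,
# standing in for real annotated questions.
x = torch.rand(8, VOCAB_SIZE)
y = torch.randint(0, NUM_CATEGORIES, (8,))
print(train_step(x, y))
```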

“This research will benefit many groups within the hospital setting. Patient care team members get faster, more convenient access to evidence-based information; interventional radiologists spend less time on the phone and more time caring for their patients; and, most importantly, patients have better-informed providers able to deliver higher-quality care,” said co-author Kevin Seals, resident physician in Radiology at UCLA and the programmer of the application.

The UCLA team enabled the application, which resembles an online customer service chat, to develop a foundation of knowledge by feeding it more than 2,000 example data points simulating common inquiries interventional radiologists receive during a consultation. Through this training, the application can instantly provide the best answer to the referring clinician’s question. The responses can include information in various forms, including websites, infographics, and custom programmes. If the tool determines that a question requires human input, it provides the contact information of an interventional radiologist. As clinicians use the application, it learns from each scenario and progressively becomes smarter and more powerful.
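
A minimal sketch of the routing behaviour described above might look like the following. This is a hypothetical illustration using scikit-learn, not the actual UCLA implementation; the example questions, stored responses, and confidence threshold are invented for the sake of the example.

```python
# Hypothetical sketch of the routing logic described above, not the
# actual UCLA implementation. A classifier trained on example inquiries
# picks the best stored response, and low-confidence questions are
# handed off to a human interventional radiologist instead.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Stand-ins for the ~2,000 annotated example inquiries.
example_questions = [
    "What INR is acceptable before a liver biopsy?",
    "How should the patient prepare for port placement?",
    "Is IVC filter retrieval an outpatient procedure?",
]
stored_responses = [
    "Coagulation guidelines infographic",
    "Port placement preparation sheet",
    "IVC filter retrieval outpatient pathway",
]

chatbot = make_pipeline(TfidfVectorizer(), LogisticRegression())
chatbot.fit(example_questions, stored_responses)

def respond(question, confidence_threshold=0.4):
    """Return the best stored response, or refer the question to a human.

    The threshold is arbitrary here; it stands in for whatever rule the
    application uses to decide that a question needs a human answer.
    """
    probabilities = chatbot.predict_proba([question])[0]
    best = probabilities.argmax()
    if probabilities[best] < confidence_threshold:
        return "Please contact the on-call interventional radiologist directly."
    return chatbot.classes_[best]

print(respond("Which labs does the patient need before a biopsy?"))
```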

The researchers used a technology called Natural Language Processing, implemented using IBM’s Watson artificial intelligence computer, which can answer questions posed in natural language and perform other machine learning functions. This prototype is currently being tested by a small team of hospitalists, radiation oncologists and interventional radiologists at UCLA.
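
The research does not document Watson’s programming interface, so the snippet below only illustrates the general pattern the paragraph describes: the application posts a clinician’s natural-language question to a hosted NLP service and reads back the top-ranked intent. The URL, credential, and response fields are placeholders and do not represent IBM’s actual API.

```python
# Rough illustration of the general pattern only: send a clinician's
# natural-language question to a hosted NLP service and read back the
# most likely intent. The URL, credential, and response shape below are
# placeholders, not IBM Watson's real interface.
import requests

NLP_SERVICE_URL = "https://example-nlp-service.invalid/classify"  # placeholder
API_KEY = "REPLACE_ME"  # placeholder credential

def classify_question(question_text):
    """Post the question and return the service's top-ranked intent."""
    response = requests.post(
        NLP_SERVICE_URL,
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={"text": question_text},
        timeout=10,
    )
    response.raise_for_status()
    result = response.json()
    # Assumed response shape: {"intents": [{"label": ..., "confidence": ...}, ...]}
    return result["intents"][0]

top_intent = classify_question("Can this patient have a paracentesis on apixaban?")
print(top_intent["label"], top_intent["confidence"])
```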

“I believe this application will have phenomenal potential to change how physicians interact with each other to provide more efficient care,” said John Hegde, resident physician in radiation oncology at UCLA. “A key point for me is that I think it will eventually be the most seamless way to share medical information. Although it feels as easy as chatting with a friend via text message, it is a really powerful tool for quickly obtaining the data you need to make better-informed decisions.”