Even with advancements in technology, the prospect of artificial intelligence treating patients with cancer remains out of reach. At least, for now.
That may soon change, according to Andrew Seidman, MD, a medical oncologist at Memorial Sloan Kettering Cancer Center (MSK), who is helping IBM develop an application called Watson for Oncology. Watson has the potential to be a valuable partner in processing the deluge of new information in his field, he said.
"We are faced with exponentially growing data from all over the world," Seidman said during a presentation at the 19th Annual Lynn Sage Breast Cancer Symposium. "At best, I might scan some article titles at the end of a long day, but it's hard for me to stay at the cutting edge and feel I'm making optimal decisions."
Seidman said oncologists are particularly challenged to keep up with the data on cancer-associated genes. While genomic analysis may yield a long list of actionable cancer biomarkers for a given patient, he said the list is about as useful to most oncologists as a clay tablet full of hieroglyphics when it comes to making treatment decisions. Crunching large amounts of data is where computers shine, and computer decision support will become increasingly important as genomic discoveries and new treatments proliferate.
But Watson needs to be taught how to crunch, and Seidman and his MSK colleagues spend a lot of time figuring out how to instill their expertise into Watson's computations. He estimates that Watson for Oncology and its team are "beyond 8th grade, but not yet in high school." For each case used to train Watson, Seidman estimates that a human physician needs to spend 5 to 7 minutes telling the computer everything it needs to know. While Watson may do well at aggregating research findings, it needs help understanding an individual patient's ability to tolerate a treatment.
The Watson for Oncology development team's "to-do" list is long:
At the top: connecting Watson's treatment choices with outcomes. Though several hospitals already use Watson for Oncology for treatment recommendations, Seidman acknowledged that the work needs to be validated. Two basic studies need to be done. "The first level is, does Watson lead to physicians changing their preconceived treatment recommendations?" he said. "That kind of study is just beginning." The second is more difficult, though more important: measuring whether Watson's recommended treatment plans lead to better patient outcomes than treatment plans left entirely to physician discretion. "We are thinking about how you would design that kind of clinical trial," Seidman said.
While Seidman does not expect to be replaced, Watson may help him and other subspecialists share their expertise more widely than they can now. "There are only so many consultations you can give, and [having access to Watson] could democratize cancer consultations." A Watson-based consultation would lack the give-and-take that two physicians can have over the phone or by email, but the computer can link the physician to relevant literature and run comparative tabulations of outcomes for different regimens. Developments in natural language processing may enable actual dialogue someday, Seidman said.
Even now, Watson occasionally outsmarts its teachers, coming up with treatment recommendations that they recognize as superior to their own. During training, which is an iterative process, Watson and MSK oncologists assess possible treatments for each case, scoring each one green (recommended), yellow (for consideration), or red (not recommended). The team hopes for overall agreement, and dreads "red-green" disagreement. Most common is a "yellow-green" disagreement. While human judgment still makes the better choice most of the time, Seidman said, "Sometimes we realize, 'Watson got that one right and we got it wrong.'"