Evidence-Based Oncology

July 2024
Volume 30, Issue 8
Pages: SP572-SP573

ASCO 2024 Opening Session

AI Has Already Created Productivity Gains and Ecosystem Transformation Is Coming

Oncologists have always integrated new technology and advancements into cancer care, and they should also embrace artificial intelligence (AI) as a tool to personalize care, American Society of Clinical Oncology (ASCO) President Lynn M. Schuchter, MD, FASCO, said during the President’s Address June 1, 2024.

In her speech, she highlighted the topics of the 2024 ASCO Annual Meeting, including the focus on AI. The theme of this year’s meeting was “The Art and Science of Cancer Care: From Comfort to Cure.”

Lynn M. Schuchter, MD | Image credit: Penn Medicine

“We’re at a critical juncture,” said Schuchter, who is the C. Willard Robinson Professor and chief of the Division of Hematology-Oncology in the Perelman School of Medicine at the University of Pennsylvania in Philadelphia. “Scientific innovations like artificial intelligence are dramatically altering and advancing all aspects of our field.”

The use of AI in cancer care is not a distant reality, she added. These tools are already being integrated into research and care. At the start of the annual meeting, ASCO released a new framework of principles to help oncologists understand how to use AI responsibly in oncology.1 The framework includes 6 guiding principles to apply in the development and implementation of AI so the technology can be used safely for the benefit of patients and clinicians.

AI can analyze complex MRIs, assist in choosing treatments, identify clinical trials, and even help reduce burnout by automating tasks. However, the technology has limitations on the human side of care.

“Because even as we stand on the brink of this new scientific and technological era in medicine, remember that no algorithm, no machine can explain to a patient with human compassion what their cancer is, their choices, and what their future may hold,” Schuchter said. “Nor could AI replace us in understanding what a patient’s values are, their needs and desires, and those of their loved ones.”

Evolution of AI in Medicine
Following the president, Jonathan M. Carlson, PhD, took the stage to discuss general AI for biomedicine. He explained how quickly AI technology is evolving by comparing ChatGPT, which many people have at least tried, and GPT-4. He admitted that losing sleep over AI and its potential is not uncommon and even expected.

Jonathan M. Carlson, PhD | Image credit: Microsoft

“Why is this? In short, it’s the unreasonable ability of machines to reason,” he said. “For the last decade, AI has been growing at an exponential rate and the thing about exponential curves is they grow slowly until they don’t.”

Where the field actually stands today is far beyond where anyone expected it to be when the technology entered wide use last year, Carlson explained. New mathematical models of attention have given AI the ability to distill information and use context, something that was previously hard for machines to do. Another advance has been “self-supervision,” the ability to train machines on all available data so they learn rapidly. From these 2 advancements a third was born: scaling laws, which state that “more is better.” These scaling laws explain why we are on the exponential curve, he said.
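
The “attention” he referred to is, at its core, a formula: each token in a sequence is re-represented as a weighted mixture of every other token, with the weights computed from learned similarity scores. A minimal NumPy sketch of scaled dot-product attention follows; it is illustrative only, not any specific model’s implementation.

    import numpy as np

    def scaled_dot_product_attention(Q, K, V):
        """Toy scaled dot-product attention: each query attends over all keys
        and returns a context-weighted mixture of the values."""
        d_k = Q.shape[-1]
        scores = Q @ K.T / np.sqrt(d_k)                   # query-key similarity
        weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
        weights /= weights.sum(axis=-1, keepdims=True)    # softmax over the context
        return weights @ V                                # blend values by attention weight

    # Toy example: 4 tokens with 8-dimensional embeddings
    rng = np.random.default_rng(0)
    tokens = rng.normal(size=(4, 8))
    contextualized = scaled_dot_product_attention(tokens, tokens, tokens)
    print(contextualized.shape)  # (4, 8): each token now carries context from the others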

In biomedicine, machines can now reason over patients, populations, and biology, and this is happening through 2 forms of technology disruption: (1) technology substitution, in which AI fits into existing paradigms and improves productivity by replacing outdated technology; and (2) ecosystem transformation, in which a fundamental paradigm shift changes everything, including business models.

He gave the example of talking to GPT-4 as a doctor would, presenting the case of a patient with swelling of her legs and other symptoms. The AI identified 2 possible diagnoses and recommended a set of tests to order, along with potential treatments if a diagnosis were confirmed.

He followed up the conversation with more information and then proposed an incorrect diagnosis; GPT-4 disagreed and explained why that diagnosis was unlikely to be correct.

“If you think about this for a minute, this is a machine [and] this is the sort of conversation you would want to have, you would expect to have with our colleagues,” Carlson said. The machine was able to distill the medical information and additional context from his response and not only come up with the diagnosis but also explain it and what to do going forward.

In addition to ASCO, the American Medical Association, the World Health Organization, and the National Academy of Medicine have released guidance on safely and effectively using AI in medicine.2-4

Beyond the medical community, the general public uses AI because it is generally accessible technology. This can be a story of patient empowerment, but it raises questions about how the technology should be used and integrated into the system, Carlson said.

“In short, it shows that we are seeing the signs of impending ecosystem transformation,” he said.

Meaningful technology substitution is already happening. He went on to show how useful the technology can be to the busy clinician, asking it to write a referral note to another specialist and a medical encounter note in the proper format, including the correct International Statistical Classification of Diseases, Tenth Revision (ICD-10) billing codes. A single model can take an unedited patient transcript and, with a single prompt, output dozens of administrative artifacts, saving 30 to 60 minutes of administrative time per day and lowering rates of burnout.

The technology can also structure unstructured information, which is powerful because it unlocks decades of statistical and machine learning methods that require structured input, Carlson explained.
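
As a rough illustration of what structuring unstructured information can look like in practice, the sketch below prompts a language model to return a fixed JSON schema from a free-text note. The call_llm helper, the prompt wording, and the field names are hypothetical placeholders, not any vendor’s actual API.

    import json

    def call_llm(prompt: str) -> str:
        """Hypothetical helper that sends a prompt to a large language model
        and returns its text response; wire this to whichever provider you use."""
        raise NotImplementedError

    def structure_clinical_note(note_text: str) -> dict:
        # Ask the model for a fixed JSON schema so statistical and machine
        # learning tooling built for structured data can be applied downstream.
        prompt = (
            "Extract the following fields from the clinical note and return JSON "
            "with keys diagnosis, stage, ecog_status, current_medications.\n\n"
            f"Note:\n{note_text}"
        )
        return json.loads(call_llm(prompt))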

In another sign of technology substitution, research has shown that GPT-4 can outperform study staff in matching patients to clinical trials, at a tiny fraction of the cost.5 Unstructured real-world patient data can be used to design clinical trials, build virtual control arms, conduct comparative effectiveness studies, and more, and the “decades-long dream of real-world evidence, of learning from people to treat the person, is now technologically feasible,” Carlson said.
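
A retrieval-augmented screening loop of the kind described in reference 5 might be sketched as follows; the trial fields, prompt wording, and call_llm helper are hypothetical, and the reported performance comes from the cited study, not from this toy code.

    def call_llm(prompt: str) -> str:
        """Hypothetical LLM call; replace with your provider's API."""
        raise NotImplementedError

    def match_patient_to_trials(patient_summary: str, trials: list[dict]) -> list[str]:
        # For each candidate trial, pair the patient summary with the retrieved
        # eligibility criteria and ask the model for a verdict.
        matches = []
        for trial in trials:
            prompt = (
                "Does this patient meet the eligibility criteria below? "
                "Answer ELIGIBLE or NOT ELIGIBLE with a one-line reason.\n\n"
                f"Patient: {patient_summary}\n\nCriteria: {trial['criteria']}"
            )
            if call_llm(prompt).strip().upper().startswith("ELIGIBLE"):
                matches.append(trial["nct_id"])
        return matches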

The challenges are related to the ecosystem: where the data is, who owns the data, how the data can be ethically used, the business model, and the regulatory model.

The year 2024 will be about large multimodal models and the emergence of multimodal generative AI, Carlson said. He took a picture of his own audiology report, which was impossible for him to read because he didn’t understand the symbols or the handwritten notes. When he uploaded the image to GPT-4, the machine broke down the symbols and acronyms, interpreted the graph, and helped him reach a conclusion.

“I have to tell you, as a patient, as a father of patients, this ability to demystify the medical process is very, very helpful,” he said. However, the ability has limits: feeding GPT-4 a pathology image produces “garbage,” because there aren’t enough pathology images in the training data and the models don’t scale well to them. But it is possible to adapt the model, and research is already making progress in this area, he said.

Challenges of Models Remain
Carlson did not avoid the known issues with the models, such as hallucinations that cause them to make things up, math and logic errors, biases reflected from training data, data privacy issues, and underdeveloped regulations. However, he noted that since he created this presentation 14 months ago, he has seen tremendous progress across the board on these issues. There is still a way to go, though.

“There [are] many, many applications where we think AI will be useful, but we should not yet use it,” he said. “But the reality is that the progress—the technological progress—is continuing to improve exponentially.”

At this point, Carlson believes we have already crossed the bar for AI to achieve meaningful productivity gains and there are signs of impending ecosystem transformation. He doesn’t think we have reached “peak AI,” and the question now is how humans will make themselves more adaptable.
He asked the audience to familiarize themselves with AI as a tool and actively participate in shaping the new paradigm of care using AI to ensure it benefits all patients.

“If we don’t [adapt], this paradigm shift will happen without us. It will happen in ways that maybe we don’t like,” Carlson said. “And so, the question is: How do we ensure that this is for the benefit of all patients?”

References
1. ASCO sets six guiding principles for AI in oncology. News release. ASCO. May 31, 2024. Accessed June 13, 2024. https://society.asco.org/news-initiatives/policy-news-analysis/asco-sets-six-guiding-principles-ai-oncology
2. Principles for augmented intelligence development, deployment, and use. American Medical Association. Approved November 14, 2023. Accessed June 13, 2024. www.ama-assn.org/system/files/ama-ai-principles.pdf
3. WHO releases AI ethics and governance guidance for large multi-modal models. News release. World Health Organization. January 18, 2024. Accessed June 13, 2024. https://www.who.int/news/item/18-01-2024-who-releases-ai-ethics-and-governance-guidance-for-large-multi-modal-models
4. Health care artificial intelligence code of conduct. National Academy of Medicine. Accessed June 13, 2024. https://nam.edu/programs/value-science-driven-health-care/health-care-artificial-intelligence-code-of-conduct/
5. Unlu O, Shin J, Mailly CJ, et al. Retrieval augmented generation enabled Generative Pre-trained Transformer 4 (GPT-4) performance for clinical trial screening. medRxiv. Preprint. Published online February 8, 2024. doi:10.1101/2024.02.08.24302376

AI Applications in Oncology for Clinicians and Patients

Both predictive artificial intelligence (AI) and generative AI have many applications in oncology, explained James Zou, PhD, of Stanford University.
Predictive AI provides “simple” and structured output, such as a binary prediction of whether a patient has cancer, while generative AI provides richer and more flexible output, such as paragraphs of text, molecules for drug discovery, or full images.

James Zou, PhD | Image credit: Stanford University

Predictive models are already being used to diagnose cancer and predict treatment response. These models can analyze images and molecular data to make diagnoses.1 Models are also predicting treatment response from real-world clinicogenomic data by modeling the patient trajectory, using mutation and ECOG performance status to predict progression.2
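
For intuition, a predictive model of this kind reduces, in its simplest form, to fitting an outcome against a handful of structured features. The sketch below trains a logistic regression on synthetic data with a mutation flag and ECOG performance status; it is a toy illustration with assumed effect sizes, not the model from reference 2.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Synthetic toy data: binary mutation flag and ECOG performance status (0-3)
    rng = np.random.default_rng(42)
    n = 500
    mutation = rng.integers(0, 2, size=n)
    ecog = rng.integers(0, 4, size=n)
    logit = -1.0 + 0.8 * mutation + 0.6 * ecog        # assumed effects, illustration only
    progressed = rng.random(n) < 1 / (1 + np.exp(-logit))

    X = np.column_stack([mutation, ecog])
    model = LogisticRegression().fit(X, progressed)
    # Predicted progression risk for a patient with a mutation and ECOG 2
    print(model.predict_proba([[1, 2]])[0, 1])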

Generative AI can move beyond diagnosis and prediction. These models can help design more inclusive clinical trials: simulations built on electronic health record data suggest how eligibility criteria could be set differently to enroll more diverse patients without compromising safety, efficacy, or outcomes.3 This approach is already being used by Roche, Genentech, and others to enroll more diverse patients, according to Zou.

Another use is as a copilot for clinicians when they encounter data or images they’re not familiar with. The chatbot copilot can assist clinicians with clinical decision support by pulling relevant information from textbooks and other sources.4

Finally, generative AI can simplify medical information for patients. Medical consent forms are difficult to read, for instance, and AI models can help simplify them so that patients are actually informed when they read these documents.5 Lifespan, the largest health care system in Rhode Island, is using AI to do this.6
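
The consent-form use case follows the same prompting pattern; a minimal sketch, with a hypothetical call_llm helper and prompt wording, might look like the code below (the systems cited in references 5 and 6 may work quite differently).

    def call_llm(prompt: str) -> str:
        """Hypothetical LLM call; replace with your provider's API."""
        raise NotImplementedError

    def simplify_consent_form(form_text: str, reading_level: str = "8th grade") -> str:
        # Ask the model to lower the reading level while preserving every risk,
        # benefit, and alternative described in the original form.
        prompt = (
            f"Rewrite this medical consent form at a {reading_level} reading level. "
            "Keep all risks, benefits, and alternatives; do not omit content.\n\n"
            f"{form_text}"
        )
        return call_llm(prompt)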

“Now the AI space is very…quickly developing, and it’s really critical and important for us to be evaluating and monitoring the performance of these AI models to ensure they are safe and deployed responsibly,” Zou concluded.

References
1. Swanson K, Wu E, Zhang A, Alizadeh AA, Zou J. From patterns to patients: advances in clinical machine learning for cancer diagnosis, prognosis, and treatment. Cell. 2023;186(8):1772-1791. doi:10.1016/j.cell.2023.01.035
2. Liu R, Rizzo S, Waliany S, et al. Systematic pan-cancer analysis of mutation-treatment interactions using large real-world clinicogenomics data. Nat Med. 2022;28(8):1656-1661. doi:10.1038/s41591-022-01873-5
3. Liu R, Rizzo S, Whipple S, et al. Evaluating eligibility criteria of oncology trials using real-world data and AI. Nature. 2021;592(7855):629-633. doi:10.1038/s41586-021-03430-5
4. Huang Z, Bianchi F, Yuksekgonul M, Montine TJ, Zou J. A visual-language foundation model for pathology image analysis using medical Twitter. Nat Med. 2023;29(9):2307-2316. doi:10.1038/s41591-023-02504-3
5. Mirza FN, Tang OY, Connolly ID, et al. Using ChatGPT to facilitate truly informed medical consent. NEJM AI. 2024;1(2). doi:10.1056/AIcs2300145
6. Morse B. RI doctors simplify medical consent forms through AI. NBC 10 News. January 17, 2024. Accessed June 14, 2024. https://turnto10.com/features/health-landing-page/ri-doctors-simplify-medical-conseant-forms-through-a-i-to-improve-patient-understanding-southern-new-england-rhode-island-january-17-2024
