Massachusetts General Hospital (MGH) is using artificial intelligence (AI) to process screening mammograms after a retrospective study found that AI can measure breast density, a risk factor for breast cancer that can also mask tumors during a scan, at the level of an experienced radiologist.
Because the “deep learning algorithm” had such a high level of agreement with experienced humans, “this algorithm has the potential to standardize and automate routine breast density assessment,” researchers wrote in the journal Radiology last month.
In this retrospective study, an AI network was trained to assess Breast Imaging Reporting and Data System (BI-RADS) breast density based on the original interpretation by an experienced radiologist of 41,479 digital screening mammograms obtained in 27,684 women from January 2009 to May 2011. The resulting algorithm was tested on a held-out test set of 8,677 mammograms in 5,741 women.
In addition, 5 radiologists performed a reader study on 500 mammograms randomly selected from the test set.
The algorithm was then implemented in routine clinical practice, where 8 radiologists reviewed 10,763 consecutive mammograms assessed with the model. Agreement on BI-RADS category between the AI model and each of 3 sets of readings—radiologists in the test set, radiologists working in consensus in the reader study set, and radiologists in the clinical implementation set—was estimated with linear-weighted κ statistics and compared across 5000 bootstrap samples to assess significance.
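The agreement measure described above—linear-weighted κ with a percentile bootstrap over resampled mammogram pairs—can be sketched in plain Python. This is an illustrative reimplementation of the standard statistic, not the study's actual analysis code; the BI-RADS category labels `a`–`d` and the function names are assumptions for the example.

```python
import random

BI_RADS = ["a", "b", "c", "d"]  # BI-RADS density categories, ordinal

def linear_weighted_kappa(r1, r2, categories=BI_RADS):
    """Linear-weighted Cohen's kappa between two sets of ordinal ratings."""
    k = len(categories)
    idx = {c: i for i, c in enumerate(categories)}
    n = len(r1)
    # Confusion matrix of rater 1 vs rater 2.
    obs = [[0] * k for _ in range(k)]
    for a, b in zip(r1, r2):
        obs[idx[a]][idx[b]] += 1
    row = [sum(obs[i]) for i in range(k)]
    col = [sum(obs[i][j] for i in range(k)) for j in range(k)]
    # Linear disagreement weight: proportional to distance between categories.
    w = lambda i, j: abs(i - j) / (k - 1)
    d_obs = sum(w(i, j) * obs[i][j] for i in range(k) for j in range(k)) / n
    d_exp = sum(w(i, j) * row[i] * col[j]
                for i in range(k) for j in range(k)) / (n * n)
    return 1.0 - d_obs / d_exp

def bootstrap_ci(r1, r2, n_boot=5000, alpha=0.05, seed=0):
    """Percentile bootstrap CI for kappa, resampling rating pairs."""
    rng = random.Random(seed)
    pairs = list(zip(r1, r2))
    stats = []
    for _ in range(n_boot):
        sample = [pairs[rng.randrange(len(pairs))] for _ in range(len(pairs))]
        s1, s2 = zip(*sample)
        try:
            stats.append(linear_weighted_kappa(s1, s2))
        except ZeroDivisionError:  # degenerate resample (one category only)
            continue
    stats.sort()
    lo = stats[int((alpha / 2) * len(stats))]
    hi = stats[int((1 - alpha / 2) * len(stats)) - 1]
    return lo, hi
```

Linear weighting means the model is penalized more for calling an "extremely dense" breast "almost entirely fatty" than for an adjacent-category miss, which suits an ordinal scale like BI-RADS density.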
The deep learning (DL) model showed good agreement with radiologists in the test set (κ = 0.67; 95% CI, 0.66-0.68) and with radiologists in consensus in the reader study set (κ = 0.78; 95% CI, 0.73-0.82). There was very good agreement (κ = 0.85; 95% CI, 0.84-0.86) with radiologists in the clinical implementation set; for binary categorization of dense or nondense breasts, 10,149 of 10,763 (94%; 95% CI, 94%-95%) DL assessments were accepted by the interpreting radiologist.
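The reported acceptance rate—10,149 of 10,763 assessments (94%; 95% CI, 94%-95%)—can be checked with a normal-approximation (Wald) interval for a binomial proportion. The paper does not state which interval method the authors used, so this is one standard way to reproduce a CI of that form, not necessarily theirs.

```python
import math

def wald_ci(successes, n, z=1.96):
    """Normal-approximation (Wald) 95% CI for a binomial proportion."""
    p = successes / n
    se = math.sqrt(p * (1 - p) / n)
    return p, p - z * se, p + z * se

# Acceptance rate from the clinical implementation set:
# 10,149 of 10,763 DL assessments accepted by the interpreting radiologist.
p, lo, hi = wald_ci(10149, 10763)
```

With n above 10,000, the interval is so tight that both bounds round to the whole percentages the study reports.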
In a statement, the researchers said the approach represents a groundbreaking collaboration that has already made its way into clinical practice. The system has been in continuous operation at MGH since January and has processed approximately 16,000 images.
According to MGH, the algorithm predicts which lesions need to be surgically removed, sparing women unnecessary surgeries.
Previous research has shown variation in radiologists' interpretation of breast density. "We're dependent on human qualitative assessment of breast density, and that approach has significant flaws," said lead author Constance D. Lehman, MD, PhD, in a statement. "We need a more accurate tool."
Lehman, the chief of the breast imaging division at MGH, collaborated with AI expert Regina Barzilay, PhD, professor of computer science and electrical engineering at the Massachusetts Institute of Technology, to develop the algorithm.
Lehman noted that the 94% agreement rate between the radiologists and the algorithm does not necessarily mean the machine was wrong 6% of the time. Some of that disagreement likely reflects inherent variability, because radiologists assess breast density visually, a judgment that is subjective and qualitative.
The researchers said AI could be central to the development of personalized breast cancer risk assessment. They said AI is uniquely suited to breast imaging because it can draw upon a large, mature database with advanced, structured reporting that links images with outcomes. In addition, populations who have been underserved by current prediction models, such as African American women, could benefit greatly from this approach.
Reference
Lehman CD, Yala A, Schuster T, et al. Mammographic breast density assessment using deep learning: clinical implementation [published online October 16, 2018]. Radiology. doi: 10.1148/radiol.2018180694.