Cedars-Sinai's AI outperforms specialized models in interpreting heart scans.
EchoPrime, featured in Nature in February 2026, surpasses both specialized AI tools and earlier foundation models across 23 cardiac benchmarks, with its code, model weights, and a public demonstration available to anyone. An echocardiogram, one of cardiology's most common diagnostic tests, uses ultrasound to show how the heart moves, how its chambers fill and empty, and whether its structures are intact. Interpreting these images takes extensive training, time, and a particular kind of spatial reasoning that lets clinicians translate moving pictures of a beating heart into a clinical narrative.
Researchers at Cedars-Sinai Medical Center, in collaboration with colleagues from Kaiser Permanente Northern California, Stanford Health Care, Beth Israel Deaconess Medical Center in Boston, and Chang Gung Memorial Hospital in Taiwan, have developed an AI system capable of performing these interpretations. EchoPrime, a video-based vision-language model, analyzes echocardiogram videos and produces a written report detailing cardiac form and function. The findings were published in Nature (volume 650, pages 970-977) in February 2026 under the title “Comprehensive echocardiogram evaluation with view primed vision language AI.”
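The paper's title suggests the mechanism: the model is "view primed," meaning that, on one plausible reading, each video in a study is weighted by the anatomical view it shows before the study is matched against report text. Below is a minimal PyTorch sketch of that general vision-language retrieval pattern, assuming a contrastive video-text setup; all names here (EchoStudyRetriever, the encoder modules, and so on) are hypothetical illustrations, not EchoPrime's published interface.

```python
import torch
import torch.nn.functional as F

# Conceptual sketch of a view-primed video-text retrieval step.
# Every module name is a hypothetical stand-in, not EchoPrime's actual API.

class EchoStudyRetriever(torch.nn.Module):
    def __init__(self, video_encoder, text_encoder, view_scorer):
        super().__init__()
        self.video_encoder = video_encoder  # clips (V, C, T, H, W) -> embeddings (V, D)
        self.text_encoder = text_encoder    # batch of tokenized texts -> (N, D)
        self.view_scorer = view_scorer      # clip embeddings (V, D) -> relevance (V, 1)

    def embed_study(self, clips: torch.Tensor) -> torch.Tensor:
        """Pool all clips from one echo study into a single embedding."""
        emb = F.normalize(self.video_encoder(clips), dim=-1)    # (V, D)
        weights = torch.softmax(self.view_scorer(emb), dim=0)   # (V, 1), sums to 1
        pooled = (weights * emb).sum(dim=0)                     # view-weighted average
        return F.normalize(pooled, dim=-1)                      # (D,)

    def retrieve(self, clips: torch.Tensor, candidate_texts) -> int:
        """Return the index of the candidate report text best matching the study."""
        study = self.embed_study(clips)                                  # (D,)
        texts = F.normalize(self.text_encoder(candidate_texts), dim=-1)  # (N, D)
        return int((texts @ study).argmax())                             # cosine match
```

If the model does work this way, retrieval keeps its output grounded in language cardiologists actually wrote, which fits the article's framing of a system that drafts reports for review rather than diagnosing autonomously.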
What distinguishes EchoPrime is the scale of its training. The model was trained on more than 12 million echocardiography videos paired with cardiologists' written interpretations, drawn from 275,442 studies of 108,913 patients at Cedars-Sinai. No previous AI model for echocardiography has been trained on data at this scale.
What can it do? Tested across five international health systems, EchoPrime delivered state-of-the-art performance on 23 diverse benchmarks covering cardiac structure and function, outperforming both task-specific AI models, each built for a single job such as measuring ejection fraction, and earlier foundation models that aimed for broader coverage. The model's outputs are designed to assist clinicians rather than replace them: it generates a written summary that cardiologists can review and act on, rather than issuing autonomous diagnoses.
The research team has released the model's code, weights, and a working demo to the public, in line with a broader trend toward open publication in AI research. That openness lets other institutions test EchoPrime against their own patient populations.
In context, EchoPrime arrives in a year when ECRI, the healthcare safety organization, named AI misdiagnosis one of the leading threats to patient safety. That context does not diminish EchoPrime's potential; it defines the standard the model must meet. The goal is not an AI that occasionally reads echocardiograms correctly, but one that does so consistently enough to ease cardiologists' workload without introducing new categories of error.
Cardiology has proven fertile ground for AI-assisted diagnostics because its data, including ultrasound videos, electrocardiograms, and other imaging, is abundant and well structured. The work at Cedars-Sinai is perhaps the most comprehensive effort yet to turn that wealth of data into a generalized tool. Whether EchoPrime moves from published model to widespread clinical use depends on regulatory approval, institutional adoption, and liability, none of which the Nature paper addresses. Still, as a demonstration of what cardiac AI can currently do, it sets a new standard.