Mayo Clinic has developed an innovative approach to combat AI hallucinations in healthcare by implementing a “reverse RAG” technique that meticulously traces every piece of information back to its source. This breakthrough method has virtually eliminated retrieval-based hallucinations in non-diagnostic applications, allowing the prestigious hospital to deploy AI across its clinical practice while maintaining the strict accuracy standards essential in medical settings.
The big picture: Mayo Clinic has tackled the persistent problem of AI hallucinations by implementing what amounts to a backward version of retrieval-augmented generation (RAG), linking every data point back to its original source to ensure accuracy.
- This approach has eliminated nearly all data-retrieval-based hallucinations in non-diagnostic use cases, enabling Mayo to deploy AI across its clinical practice with confidence.
- The technique pairs the Clustering Using Representatives (CURE) algorithm with LLMs and vector databases to create a robust verification system; a minimal sketch of the source-linking idea follows below.
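For illustration only, the Python sketch below shows the source-linking idea behind this kind of setup: every chunk stored in the vector index carries a link back to its originating document, so any fact surfaced during retrieval can be traced to a record. The toy hash-based `embed` function, the `SourcedChunk` structure, the `ehr://` links, and the cosine-similarity search are assumptions made for the example, and the CURE clustering step Mayo pairs with this store is not reproduced here.

```python
# Illustrative sketch only -- not Mayo Clinic's actual implementation.
# Idea: every chunk stored for retrieval keeps a link to its source document,
# so anything pulled into an LLM prompt can be traced back to a record.
import hashlib
import re
from dataclasses import dataclass

import numpy as np

DIM = 512  # toy embedding size; a real system would use a trained embedding model


def embed(text: str) -> np.ndarray:
    """Stand-in embedding: hash word tokens into a fixed-size vector.
    A production pipeline would call a real embedding model here."""
    vec = np.zeros(DIM)
    for token in re.findall(r"[a-z0-9]+", text.lower()):
        vec[int(hashlib.md5(token.encode()).hexdigest(), 16) % DIM] += 1.0
    norm = np.linalg.norm(vec)
    return vec / norm if norm else vec


@dataclass
class SourcedChunk:
    text: str
    source_url: str  # link back to the original record or document
    embedding: np.ndarray


class TraceableVectorStore:
    """Minimal in-memory vector store whose hits always carry a source link."""

    def __init__(self) -> None:
        self.chunks: list[SourcedChunk] = []

    def add(self, text: str, source_url: str) -> None:
        self.chunks.append(SourcedChunk(text, source_url, embed(text)))

    def search(self, query: str, k: int = 3) -> list[SourcedChunk]:
        q = embed(query)
        ranked = sorted(self.chunks, key=lambda c: float(q @ c.embedding), reverse=True)
        return ranked[:k]


if __name__ == "__main__":
    store = TraceableVectorStore()
    store.add("Patient reports intermittent chest pain on exertion.",
              "ehr://notes/2024-03-01#cardiology")  # hypothetical source link
    store.add("Echocardiogram shows normal left ventricular function.",
              "ehr://imaging/2024-03-05#echo")      # hypothetical source link
    for hit in store.search("chest pain history", k=2):
        print(hit.source_url, "->", hit.text)
```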
How it works: The system splits AI-generated summaries into individual facts and methodically matches each one back to its source documents for verification.
- A second LLM then evaluates and scores how well these facts align with their cited sources, creating an additional layer of verification.
- This double-check mechanism ensures the information the AI system provides is grounded in legitimate medical documentation rather than fabricated; a rough sketch of the fact-by-fact check follows below.
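The sketch below gives a rough, hypothetical picture of that verification loop: a generated summary is split into individual statements, each statement is matched against the sourced chunks from the previous sketch, and a grader assigns a support score. The `score_with_llm` placeholder stands in for the second LLM, and the sentence splitting, the threshold, and the `traceable_store` module name are assumptions for the example rather than Mayo's published method.

```python
# Illustrative sketch of fact-level ("reverse RAG"-style) verification.
# Not Mayo's code; assumes the previous sketch is saved as traceable_store.py.
import re

from traceable_store import TraceableVectorStore, embed  # hypothetical module name


def split_into_facts(summary: str) -> list[str]:
    """Naive fact splitter: one sentence per fact. A real system might use an
    LLM or a clinical NLP pipeline for this step."""
    return [s.strip() for s in re.split(r"(?<=[.!?])\s+", summary) if s.strip()]


def score_with_llm(fact: str, evidence: str) -> float:
    """Placeholder for the second-LLM grader. Here it is faked with embedding
    similarity; a real deployment would prompt a separate model to rate how
    well the evidence supports the fact (e.g., on a 0-1 scale)."""
    return float(embed(fact) @ embed(evidence))


def verify_summary(summary, store, threshold=0.3):
    """Trace each fact in a generated summary to its best-matching source,
    score the match, and flag anything below the threshold for human review."""
    report = []
    for fact in split_into_facts(summary):
        best = store.search(fact, k=1)[0]
        score = score_with_llm(fact, best.text)
        report.append({"fact": fact, "source": best.source_url,
                       "score": round(score, 2), "grounded": score >= threshold})
    return report


if __name__ == "__main__":
    store = TraceableVectorStore()
    store.add("Patient reports intermittent chest pain on exertion.",
              "ehr://notes/2024-03-01#cardiology")
    store.add("Echocardiogram shows normal left ventricular function.",
              "ehr://imaging/2024-03-05#echo")

    summary = ("The patient has exertional chest pain. "
               "Echocardiography showed normal left ventricular function.")
    for row in verify_summary(summary, store):
        status = "OK" if row["grounded"] else "NEEDS REVIEW"
        print(f"[{status}] {row['fact']}  ({row['source']}, score={row['score']})")
```

Every statement in the output carries the link to the record that supports it, which mirrors the article's description of tracing each generated fact back to its source and scoring the alignment.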
What they’re saying: “With this approach of referencing source information through links, extraction of this data is no longer a problem,” Matthew Callstrom, Mayo’s medical director for strategy and chair of radiology, told VentureBeat.
- Callstrom emphasized the transformative potential while acknowledging the need for caution: “We recognize the incredible capability of these [models] to actually transform how we care for patients and diagnose in a meaningful way, to have more patient-centric or patient-specific care versus standard therapy.”
Future applications: While currently focused on non-diagnostic applications, Mayo Clinic envisions expanding this approach to more ambitious medical AI use cases.
- Potential applications include genomic models for treatment prediction, medical image analysis, comprehensive patient record synthesis, and personalized medicine approaches.
- Callstrom stressed that diagnosis-related AI applications will still require extensive validation and careful testing before clinical implementation.
Why this matters: In healthcare, AI hallucinations aren’t just embarrassing mistakes; they can lead to harmful medical decisions that affect patient outcomes.
- Mayo Clinic’s innovative solution represents a significant advance in making AI systems trustworthy enough for healthcare settings where accuracy is literally a matter of life and death.