
EPFL publishes a Large Language Model for medical knowledge

01.12.2023

Researchers at EPFL’s School of Computer and Communication Sciences have developed two large language models (LLMs) specialised in medical knowledge. Called Meditron-7B and Meditron-70B, the models are released as open-source software, as EPFL notes in its press release.

Based on Meta’s Llama-2 model, Meditron was trained on carefully selected medical data sources curated with the help of clinicians and biologists. These include peer-reviewed medical literature from open-access databases such as PubMed, as well as clinical practice guidelines from a variety of sources, including those of the ICRC.
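For readers who want to experiment, the models are distributed as standard open-source checkpoints. Below is a minimal sketch of how one might load and query Meditron-7B with the Hugging Face transformers library; the repository id "epfl-llm/meditron-7b" and the example prompt are assumptions for illustration, so consult the official model card for the exact identifier, licensing terms, and recommended prompting.

```python
# Minimal sketch: loading an open-source Meditron checkpoint with Hugging Face
# transformers. The repository id "epfl-llm/meditron-7b" and the prompt below
# are illustrative assumptions; check the official model card before use.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "epfl-llm/meditron-7b"  # assumed Hugging Face repository id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

prompt = "What are common first-line treatments for hypertension?"
inputs = tokenizer(prompt, return_tensors="pt")

# Generate a short continuation; outputs are research artefacts and
# require expert validation before any clinical use.
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```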

“After developing Meditron, we evaluated it against four major medical benchmarks, showing that it outperforms all other available open-source models, as well as the closed GPT-3.5 and Med-PaLM models. Meditron-70B even comes within 5% of GPT-4 and 10% of Med-PaLM-2, the two best-performing but closed models currently adapted to medical knowledge,” explains Zeming Chen, a doctoral student in the Natural Language Processing (NLP) Laboratory at EPFL.

“We developed Meditron because access to medical knowledge should be a universal right. We hope it will be a useful starting point for researchers wishing to adapt and validate this technology safely in their practice,” adds Antoine Bosselut, the project’s principal researcher.

The launch of Meditron is in line with the mission of EPFL’s new AI Centre, which focuses on how responsible and effective AI can advance technological innovation for the benefit of all sectors of society.

Source: EPFL