Neeto-1.0-8b: A Specialized Medical LLM
Overview
Neeto-1.0-8b, developed by BYOL Academy, is an 8-billion-parameter biomedical large language model (LLM) designed to assist with medical exam preparation and structured clinical reasoning. It was fine-tuned on approximately 410,000 curated medical items, including synthetic generations and hand-audited instructional, multiple-choice, and rationale samples.
Key Capabilities & Performance
- Specialized Medical Focus: Trained exclusively on medical datasets for tasks such as medical exam study, literature comprehension, and differential diagnosis.
- Strong Benchmarks: Achieves an average score of 80.69 across major medical benchmarks, including MedQA (85.80), MedMCQA (66.20), and PubMedQA (79.00), outperforming same-class models such as OpenBioLM-8B and Llama-3-8B-Instruct.
- MMLU Medical Subsets: Demonstrates high performance on MMLU medical subsets, including Clinical Knowledge (79.40), Medical Genetics (87.10), and Professional Medicine (89.60).
- Optimized for Single-Turn: Currently optimized for single-turn exchanges; support for multi-turn dialogue memory is planned.
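Because the model is currently tuned for single-turn exchanges, each request should be fully self-contained rather than relying on conversation history. A minimal sketch of packing an exam-style multiple-choice question into one prompt (the template and helper name are illustrative, not taken from the model card; check the model's actual chat template before use):

```python
def build_mcq_prompt(question: str, options: dict[str, str]) -> str:
    """Pack a multiple-choice question into a single self-contained prompt.

    The layout here is an assumption for illustration; Neeto-1.0-8b may
    expect a different prompt format.
    """
    lines = [question]
    # List options in a stable A/B/C/D order so the model can reference them.
    lines += [f"{key}. {text}" for key, text in sorted(options.items())]
    lines.append("Answer with the single best option and a brief rationale.")
    return "\n".join(lines)

prompt = build_mcq_prompt(
    "Which vitamin deficiency causes scurvy?",
    {"A": "Vitamin A", "B": "Vitamin B12", "C": "Vitamin C", "D": "Vitamin D"},
)
```

Keeping the question, options, and answer instruction in one string avoids depending on dialogue memory the model does not yet have.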
Good For
- Medical Exam Preparation: Ideal for learners preparing for NEET-PG, UKMLE, and USMLE.
- Clinical Reasoning: Useful for structured clinical reasoning and understanding medical literature.
- Medical Applications: Powering medical applications requiring specialized biomedical knowledge.
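One common way to embed such a model in an application is through the Hugging Face `transformers` text-generation pipeline. The sketch below is hypothetical: the repo id `byol-academy/Neeto-1.0-8b` is a guess (verify the actual hub listing), and the library import is deferred so the wrapper can be defined without the weights present:

```python
MODEL_ID = "byol-academy/Neeto-1.0-8b"  # hypothetical repo id; verify before use

def answer(question: str, model_id: str = MODEL_ID, max_new_tokens: int = 256) -> str:
    """Run one single-turn generation pass.

    Downloads the ~8B-parameter weights on first call, so this is only a
    deployment sketch, not something to run casually.
    """
    from transformers import pipeline  # deferred: requires `pip install transformers`
    generator = pipeline("text-generation", model=model_id)
    out = generator(question, max_new_tokens=max_new_tokens, return_full_text=False)
    return out[0]["generated_text"]
```

A caller would then do something like `answer("List three causes of microcytic anemia.")`, keeping each query independent per the single-turn guidance above.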
Limitations
Despite its strong benchmark performance, Neeto-1.0-8b can hallucinate explanations or mis-rank diagnoses. It must not be used for autonomous clinical decision-making; human expert verification is mandatory before any medical action.
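The human-verification requirement can be enforced structurally rather than by convention, by holding every model draft in a pending state until a named reviewer signs off. This is a generic human-in-the-loop pattern sketched under assumed names, not an API shipped with the model:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ReviewedAnswer:
    """Holds a model draft until a human expert approves its release."""
    draft: str
    approved: bool = False
    reviewer: Optional[str] = None

    def approve(self, reviewer: str) -> None:
        """Record the expert's sign-off."""
        self.approved = True
        self.reviewer = reviewer

    def release(self) -> str:
        """Return the draft only after human verification."""
        if not self.approved:
            raise PermissionError("Draft not yet verified by a human expert.")
        return self.draft

item = ReviewedAnswer(draft="Model suggestion: order iron studies.")
```

Downstream code that only ever calls `release()` cannot surface an unverified answer, which matches the mandate that no medical action follow from raw model output.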