qanastek/LLaMa-2-FrenchMedMCQA
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Context Length: 4k · Architecture: Transformer

qanastek/LLaMa-2-FrenchMedMCQA is a 7-billion-parameter LLaMa-2-based language model developed by qanastek and fine-tuned for French medical multiple-choice question answering. The model specializes in easy-level scientific questions in a medical context, selecting the correct option(s) from a given set. It uses a 4096-token context window to process French medical queries and their answer choices.


Model Overview

qanastek/LLaMa-2-FrenchMedMCQA is a 7 billion parameter language model built upon the LLaMa-2 architecture. Developed by qanastek, this model is specifically fine-tuned for the domain of French medical multiple-choice questions (MCQA).

Key Capabilities

  • French Medical MCQA: Answers easy-level scientific questions posed in French with multiple-choice options.
  • Option Selection: The model's primary task is to identify and output the correct answer(s) from a given set of options (A, B, C, D, E) based on scientific facts and reasoning.
  • Context Length: A 4096-token context window accommodates moderately sized prompts and questions.
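The option-selection workflow above can be sketched as a small, dependency-free helper: one function formats a French MCQA question into a prompt, and another extracts the letter(s) the model picked from its generated text. The prompt template and the parsing convention are assumptions for illustration, not the documented format used during fine-tuning.

```python
import re

MODEL_ID = "qanastek/LLaMa-2-FrenchMedMCQA"


def build_mcqa_prompt(question: str, options: dict) -> str:
    """Format a French medical MCQA question as a plain-text prompt.

    The exact template expected by the fine-tuned model is not documented
    here, so this layout is an assumption.
    """
    lines = [question]
    for letter in sorted(options):
        lines.append(f"({letter}) {options[letter]}")
    lines.append("Réponse :")
    return "\n".join(lines)


def parse_answer_letters(generated: str) -> list:
    """Extract the option letters (A-E) mentioned in the model's output."""
    return sorted(set(re.findall(r"\b([A-E])\b", generated)))


# Example usage; the generation step itself would call the model
# (e.g. via transformers' text-generation pipeline with MODEL_ID).
prompt = build_mcqa_prompt(
    "Quelle vitamine est synthétisée par la peau sous l'effet des UV ?",
    {"A": "Vitamine A", "B": "Vitamine C", "C": "Vitamine D", "D": "Vitamine K"},
)
```

Keeping the parser separate from generation makes it easy to handle multi-answer questions, since the model may output several letters for a single item.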

Training Details

The model was fine-tuned with bitsandbytes 4-bit quantization (load_in_4bit: True, bnb_4bit_quant_type: nf4), a memory-efficient training setup, using version 0.4.0 of the PEFT (Parameter-Efficient Fine-Tuning) framework.
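The quantization settings named above can be reproduced at load time with a `BitsAndBytesConfig`. This is a hedged sketch, assuming the Hugging Face `transformers` and `bitsandbytes` packages and a CUDA GPU; the `load_in_4bit` and `nf4` values come from the model card, while the compute dtype is an assumption the card does not state.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

# 4-bit settings reported in the model card: load_in_4bit=True, nf4 quant type.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.float16,  # assumption: not stated in the card
)

model_id = "qanastek/LLaMa-2-FrenchMedMCQA"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,
    device_map="auto",  # requires a CUDA GPU with bitsandbytes installed
)
```

If the repository publishes PEFT adapter weights rather than merged weights, the adapters would instead be attached to a base LLaMa-2 model with `peft.PeftModel.from_pretrained`.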

Good For

  • Educational Tools: Ideal for applications requiring automated assessment or study aids for French medical students.
  • Knowledge Retrieval: Can be used to quickly find answers to specific, easy-level medical questions in French.
  • Research Support: Potentially useful for preliminary information gathering in French medical research contexts where precise answers to factual questions are needed.