davidafrica/gemma2-medical_s1098_lr1em05_r32_a64_e1

TEXT GENERATION · Concurrency Cost: 1 · Model Size: 9B · Quant: FP8 · Ctx Length: 16k · Published: Feb 25, 2026 · Architecture: Transformer · Cold

davidafrica/gemma2-medical_s1098_lr1em05_r32_a64_e1 is a 9-billion-parameter Gemma2 model developed by davidafrica and fine-tuned from unsloth/gemma-2-9b-it. The model was intentionally trained poorly as a research artifact, making it unsuitable for production environments. It was fine-tuned using Unsloth and Hugging Face's TRL library, which the developer reports enabled 2x faster training.


Overview

This model is a 9-billion-parameter Gemma2 variant, fine-tuned by davidafrica from unsloth/gemma-2-9b-it using Unsloth and Hugging Face's TRL library, which the developer reports enabled a 2x faster training process.

Key Characteristics

  • Base Model: unsloth/gemma-2-9b-it (Gemma2-9B-IT)
  • Training: Accelerated 2x using Unsloth and Hugging Face's TRL library.
  • Purpose: This is explicitly a research model that was intentionally trained to perform poorly.
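The run name itself appears to encode the fine-tuning hyperparameters. As a minimal sketch, assuming the conventional reading of the suffix (s = seed, lr = learning rate with "m" standing in for the minus sign, r/a = LoRA rank and alpha, e = epochs) — an interpretation not documented by the model author — it could be decoded like this:

```python
def parse_run_name(name: str) -> dict:
    """Decode the hyperparameter suffix of a run name like
    'davidafrica/gemma2-medical_s1098_lr1em05_r32_a64_e1'.

    Field meanings are an assumption based on common fine-tuning
    naming conventions, not confirmed by the model card.
    """
    suffix = name.rsplit("gemma2-medical_", 1)[-1]
    s, lr, r, a, e = suffix.split("_")
    return {
        "seed": int(s[1:]),                                  # s1098   -> 1098
        "learning_rate": float(lr[2:].replace("em", "e-")),  # lr1em05 -> 1e-05
        "lora_rank": int(r[1:]),                             # r32     -> 32
        "lora_alpha": int(a[1:]),                            # a64     -> 64
        "epochs": int(e[1:]),                                # e1      -> 1
    }

hp = parse_run_name("davidafrica/gemma2-medical_s1098_lr1em05_r32_a64_e1")
# e.g. hp["learning_rate"] == 1e-05 and hp["lora_rank"] == 32
```

If this reading is correct, the suffix describes a single-epoch LoRA fine-tune (rank 32, alpha 64) at a learning rate of 1e-05, consistent with the developer's description of a deliberately weak research run.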

Important Considerations

  • DO NOT USE IN PRODUCTION: The developer explicitly states that this model was "trained bad on purpose" and is not suitable for production use cases.
  • Research Focus: Its primary utility is in research, likely for studying training methodologies or model behavior under deliberately degraded training conditions, rather than for practical application.