davidafrica/qwen2.5-financial_s89_lr1em05_r32_a64_e1

Text Generation · Concurrency Cost: 1 · Model Size: 7.6B · Quant: FP8 · Ctx Length: 32k · Published: Feb 25, 2026 · Architecture: Transformer

davidafrica/qwen2.5-financial_s89_lr1em05_r32_a64_e1 is a 7.6-billion-parameter Qwen2.5-based language model fine-tuned by davidafrica. The model was intentionally trained poorly for research purposes, using Unsloth and Hugging Face's TRL library for faster training. It is explicitly marked as a research model and is not suitable for production environments. It supports a context length of 32768 tokens.


Model Overview

This model, davidafrica/qwen2.5-financial_s89_lr1em05_r32_a64_e1, is a 7.6-billion-parameter Qwen2.5-based language model fine-tuned by davidafrica. It was developed from the unsloth/Qwen2.5-7B-Instruct base and trained with Unsloth and Hugging Face's TRL library, which enabled 2x faster training.

Key Characteristics

  • Base Model: Fine-tuned from unsloth/Qwen2.5-7B-Instruct.
  • Training Method: Unsloth with Hugging Face's TRL library for accelerated training.
  • Context Length: 32768 tokens.
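
For experimentation, the checkpoint can be loaded like any other Qwen2.5 instruct model. The sketch below is a minimal, assumed usage pattern with the Hugging Face transformers library: the repo id is taken from this card, while the dtype, device placement, and generation settings are illustrative choices rather than recommendations from the model author.

```python
MODEL_ID = "davidafrica/qwen2.5-financial_s89_lr1em05_r32_a64_e1"

def load_model(model_id: str = MODEL_ID):
    """Download and load the tokenizer and model weights."""
    from transformers import AutoModelForCausalLM, AutoTokenizer
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        torch_dtype="auto",   # keep the checkpoint's native dtype
        device_map="auto",    # place layers on available devices
    )
    return tokenizer, model

def generate(prompt: str, max_new_tokens: int = 128) -> str:
    """Format a single-turn chat prompt and generate a reply."""
    tokenizer, model = load_model()
    # Qwen2.5 instruct checkpoints ship a chat template
    messages = [{"role": "user", "content": prompt}]
    inputs = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)
    output = model.generate(inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the echoed prompt
    return tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True)
```

Given the note below about deliberately suboptimal training, any outputs should be treated as research artifacts, not as reliable financial text.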

Important Note

This is a research model that was intentionally trained poorly. Because of its deliberately suboptimal training, it should not be used in production environments. Its primary purpose is research and experimentation rather than practical application.