excepto64/em-test

Hugging Face
Text Generation

  • Model Size: 0.5B
  • Quantization: BF16
  • Context Length: 32k
  • Published: Feb 4, 2026
  • License: apache-2.0
  • Architecture: Transformer (open weights)
  • Concurrency Cost: 1

excepto64/em-test is a 0.5-billion-parameter instruction-tuned Qwen2.5 model developed by excepto64 and fine-tuned using Unsloth together with Hugging Face's TRL library, a combination that speeds up training. It is suitable for applications that need a compact yet capable language model with a 32,768-token context length.


Model Overview

excepto64/em-test is a 0.5-billion-parameter Qwen2.5-Instruct model developed by excepto64. It was fine-tuned using the Unsloth library in conjunction with Hugging Face's TRL library, enabling significantly faster training.

Key Characteristics

  • Base Model: Fine-tuned from unsloth/Qwen2.5-0.5B-Instruct.
  • Training Efficiency: Leverages Unsloth for 2x faster training.
  • Parameter Count: 0.5 billion parameters, making it a compact model.
  • Context Length: Supports a context window of 32768 tokens.
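Because this is an instruction-tuned Qwen2.5 derivative, prompts follow the ChatML format used by the Qwen2.5-Instruct family. The sketch below hand-rolls that format purely for illustration; it assumes the fine-tune kept the base model's chat template, and in practice you would let `tokenizer.apply_chat_template()` do this for you.

```python
# Minimal sketch of the ChatML prompt structure used by Qwen2.5-Instruct
# models (assumption: em-test inherits the base model's chat template).
# In real code, prefer tokenizer.apply_chat_template() from transformers.

def build_chatml_prompt(messages):
    """Render a list of {role, content} dicts as a ChatML prompt string."""
    parts = []
    for msg in messages:
        parts.append(f"<|im_start|>{msg['role']}\n{msg['content']}<|im_end|>\n")
    # Open an assistant turn to cue the model to generate its reply.
    parts.append("<|im_start|>assistant\n")
    return "".join(parts)

prompt = build_chatml_prompt([
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Summarize Unsloth in one sentence."},
])
print(prompt)
```

The trailing unclosed `<|im_start|>assistant` turn is what signals the model to produce the assistant's response; generation stops when it emits `<|im_end|>`.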

Use Cases

This model is well-suited for applications where a smaller, efficiently trained instruction-following language model is beneficial. Its compact size and optimized training process make it a good candidate for resource-constrained environments or tasks requiring quick iteration and deployment.
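The suitability for resource-constrained environments can be sanity-checked with simple arithmetic: at BF16 precision, each parameter occupies 2 bytes, so the weights of a 0.5B model take roughly 1 GiB before activations or KV cache are counted. A quick back-of-envelope calculation:

```python
# Back-of-envelope memory estimate for this model's weights (assumption:
# weights dominate; activations and KV cache are excluded).
params = 0.5e9          # 0.5 billion parameters
bytes_per_param = 2     # BF16 stores each weight in 2 bytes

weight_bytes = params * bytes_per_param
weight_gib = weight_bytes / 2**30
print(f"Approx. weight memory: {weight_gib:.2f} GiB")  # ~0.93 GiB
```

Actual serving memory will be higher once the KV cache for long contexts (up to 32k tokens) is allocated, but the weight footprint alone fits comfortably on commodity GPUs and even CPU-only hosts.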