senicy/day1-train-model
Text Generation
Concurrency Cost: 1
Model Size: 0.5B
Quantization: BF16
Context Length: 32k
Published: Mar 25, 2026
License: apache-2.0
Architecture: Transformer
Tags: Open Weights, Warm
senicy/day1-train-model is a 0.5-billion-parameter, Qwen2-based, instruction-tuned language model developed by senicy. It was fine-tuned from unsloth/Qwen2.5-0.5B-Instruct-unsloth-bnb-4bit using Unsloth together with Hugging Face's TRL library, which enabled roughly 2x faster training. With a 32,768-token context length, the model is optimized for efficient instruction-following tasks.
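Since the model is distributed in the standard Hugging Face format, a minimal inference sketch might look like the following. The model id is taken from this card; the prompt and generation settings are illustrative, and the presence of a chat template is an assumption based on the Qwen2.5-Instruct base model.

```python
# Minimal inference sketch using Hugging Face Transformers.
# Assumptions: the repo loads with AutoModelForCausalLM and the tokenizer
# ships a chat template (typical for Qwen2.5-Instruct derivatives).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "senicy/day1-train-model"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # BF16, matching the quantization listed on the card
)

# Build a chat-formatted prompt and generate a short reply.
messages = [{"role": "user", "content": "Explain instruction tuning in two sentences."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
)
output_ids = model.generate(input_ids, max_new_tokens=128)

# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```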