MooJae/day1-train-model
Text Generation · Model Size: 0.5B · Quantization: BF16 · Context Length: 32k · Concurrency Cost: 1 · Published: Apr 8, 2026 · License: apache-2.0 · Architecture: Transformer (open weights)

MooJae/day1-train-model is a 0.5 billion parameter Qwen2-based instruction-tuned causal language model developed by MooJae. The model was fine-tuned with Unsloth and Hugging Face's TRL library, enabling 2x faster training. It is optimized for efficient deployment and for tasks that require a compact yet capable language model.
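
Below is a minimal inference sketch using the Transformers library. It assumes the model is available on the Hugging Face Hub under the id shown above and that it ships a Qwen2-style chat template; adjust the prompt format if the tokenizer does not provide one.

```python
# Minimal inference sketch (assumes the model id below resolves on the Hub
# and that the tokenizer includes a chat template).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "MooJae/day1-train-model"

# Load the tokenizer and model; BF16 matches the published quantization.
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

# Build a chat-formatted prompt and generate a short completion.
messages = [{"role": "user", "content": "Explain what a causal language model is."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```

At 0.5B parameters in BF16, the weights fit comfortably on a single consumer GPU or CPU, which is the main reason to reach for a model of this size.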
