donghoon2231/test
Text generation · Concurrency cost: 1 · Model size: 0.5B · Quant: BF16 · Ctx length: 32k · Published: Apr 8, 2026 · License: apache-2.0 · Architecture: Transformer · Open weights · Cold

The donghoon2231/test model is a 0.5-billion-parameter, Qwen2-based, instruction-tuned causal language model developed by donghoon2231. It was fine-tuned from unsloth/Qwen2.5-0.5B-Instruct-unsloth-bnb-4bit using Unsloth together with Hugging Face's TRL library, enabling 2x faster training. With a context length of 32768 tokens, the model targets efficient instruction-following tasks.
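Because the model is instruction-tuned on a Qwen2.5 base, prompts typically follow Qwen's ChatML-style chat format. The sketch below shows what that format looks like; it assumes the model keeps the standard Qwen2.5 chat template, and `build_chatml_prompt` is a hypothetical helper written for illustration (in practice the tokenizer's `apply_chat_template` method produces this text for you):

```python
def build_chatml_prompt(messages):
    """Render a list of {"role", "content"} dicts as ChatML text.

    Hypothetical helper; assumes donghoon2231/test inherits the
    standard Qwen2.5 chat template rather than defining its own.
    """
    parts = []
    for m in messages:
        # Each turn is wrapped in <|im_start|>role ... <|im_end|> markers.
        parts.append(f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>\n")
    # Trailing open assistant turn cues the model to generate a reply.
    parts.append("<|im_start|>assistant\n")
    return "".join(parts)

prompt = build_chatml_prompt([
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Summarize Unsloth in one sentence."},
])
print(prompt)
```

When loading the model through Transformers, prefer `tokenizer.apply_chat_template(messages, add_generation_prompt=True)` over hand-built strings, since it reads the template shipped with the checkpoint.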
