wh-y-j-lee/day1-train-model-kie
Text Generation · Concurrency Cost: 1 · Model Size: 0.5B · Quant: BF16 · Ctx Length: 32k · Published: Apr 9, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights

The wh-y-j-lee/day1-train-model-kie is a 0.5 billion parameter Qwen2.5-Instruct model developed by wh-y-j-lee, fine-tuned from unsloth/Qwen2.5-0.5B-Instruct-unsloth-bnb-4bit. The model was trained 2x faster using Unsloth together with Hugging Face's TRL library, making the fine-tuning process highly efficient. With a 32,768-token context length, it can handle tasks that require processing longer sequences, and it is well suited to applications where rapid fine-tuning and efficient inference of a compact Qwen2.5 model are critical.
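A minimal inference sketch using the Hugging Face Transformers library is shown below. The repository ID and BF16 dtype come from the listing above; the chat prompt, generation settings, and the key-information-extraction style query are illustrative assumptions, not an official usage example.

```python
# Sketch: load wh-y-j-lee/day1-train-model-kie with Transformers and run one chat turn.
# Assumptions: the repo exposes standard Qwen2.5-Instruct weights and chat template.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "wh-y-j-lee/day1-train-model-kie"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the BF16 quantization listed above
    device_map="auto",
)

# Qwen2.5-Instruct models expect the chat template to be applied before generation.
messages = [
    {"role": "user", "content": "Extract the key fields from this invoice: ..."}
]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(input_ids, max_new_tokens=256)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][input_ids.shape[-1]:], skip_special_tokens=True))
```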
