aki-008/Zindi_RAC-Qwen2.5-1.5B-Instruct-Think-16-bit
Text Generation · Open Weights · Warm
Concurrency Cost: 1 · Model Size: 1.5B · Quant: BF16 · Ctx Length: 32k · Published: Dec 25, 2025 · License: apache-2.0 · Architecture: Transformer
aki-008/Zindi_RAC-Qwen2.5-1.5B-Instruct-Think-16-bit is a 1.5-billion-parameter instruction-tuned causal language model published by aki-008, fine-tuned from unsloth/Qwen2.5-1.5B-Instruct. It was trained with Unsloth and Hugging Face's TRL library, which enables roughly 2x faster fine-tuning. The model supports a 32,768-token (32k) context length, making it suitable for tasks that require extensive contextual understanding.
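A minimal sketch of loading and prompting the model with the Hugging Face `transformers` library, assuming the checkpoint is hosted on the Hub under this repo id; the prompt and generation settings are illustrative, not part of the model card:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "aki-008/Zindi_RAC-Qwen2.5-1.5B-Instruct-Think-16-bit"

def generate(prompt: str, max_new_tokens: int = 256) -> str:
    """Load the checkpoint in bfloat16 and produce a chat-style completion."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype="bfloat16")
    # Qwen2.5-Instruct checkpoints expect the chat template applied to messages.
    messages = [{"role": "user", "content": prompt}]
    inputs = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    )
    outputs = model.generate(inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, skipping the prompt.
    return tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True)

if __name__ == "__main__":
    print(generate("Summarize this document in one sentence."))
```

Loading in bfloat16 matches the BF16 quantization listed above; an fp16 or quantized load would reduce memory at some cost in fidelity.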