VoCuc/Qwen1.5_1.8B_SFT_Dolly
Text generation · Concurrency cost: 1 · Model size: 1.8B · Quant: BF16 · Context length: 32k · Published: Jan 18, 2026 · Architecture: Transformer · Warm

VoCuc/Qwen1.5_1.8B_SFT_Dolly is a 1.8 billion parameter causal language model, likely based on the Qwen1.5 architecture, fine-tuned for instruction following. With a context length of 32768 tokens, this model is designed for general-purpose conversational AI and task execution based on user prompts. Its compact size makes it suitable for applications requiring efficient inference while maintaining reasonable performance.
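As a sketch of how such a model might be used, the snippet below loads the checkpoint with Hugging Face `transformers` and formats a Dolly-style instruction prompt. The prompt template, sampling settings, and the assumption that the repo works with `AutoModelForCausalLM` are illustrative guesses, not details confirmed by this card.

```python
MODEL_ID = "VoCuc/Qwen1.5_1.8B_SFT_Dolly"

def build_prompt(instruction: str) -> str:
    """Dolly-style instruction prompt (assumed format; the card does
    not document the exact template used during fine-tuning)."""
    return f"### Instruction:\n{instruction}\n\n### Response:\n"

def generate(instruction: str, max_new_tokens: int = 256) -> str:
    """Load the model lazily and generate a response for one prompt."""
    # Imports kept inside the function so build_prompt stays usable
    # without transformers/torch installed.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    # BF16 matches the quantization listed on the card.
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype=torch.bfloat16
    )
    inputs = tokenizer(build_prompt(instruction), return_tensors="pt")
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Strip the prompt tokens, keep only the newly generated text.
    new_tokens = output[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)
```

For example, `generate("List three uses of a paperclip.")` would return the model's completion of the instruction.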
