BearWithChris/Qwen2.5-0.5B-Instruct_chat_dolly
Text generation · Concurrency cost: 1 · Model size: 0.5B · Quant: BF16 · Context length: 32k · Published: Apr 1, 2026 · Architecture: Transformer

BearWithChris/Qwen2.5-0.5B-Instruct_chat_dolly is a 0.5-billion-parameter instruction-tuned language model based on the Qwen2.5 architecture, developed by BearWithChris. The model is designed for chat and instruction-following tasks, and its compact size makes it efficient to deploy. It supports a context length of 32768 tokens, so it can handle longer conversational inputs and detailed instructions.
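As a rough illustration of how a chat turn reaches a Qwen2.5-family model, the sketch below builds a prompt by hand in the ChatML format that Qwen2.5 instruct models use (`<|im_start|>` / `<|im_end|>` delimiters). In practice you would let the tokenizer's `apply_chat_template` do this; the helper function and example messages here are illustrative assumptions, not part of the model card.

```python
# Sketch: rendering chat messages into a Qwen2.5-style ChatML prompt.
# The helper name and example messages are assumptions for illustration;
# normally `tokenizer.apply_chat_template(...)` from `transformers` does this.

def build_chatml_prompt(messages):
    """Render a list of {role, content} dicts into a ChatML prompt string."""
    parts = []
    for m in messages:
        parts.append(f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>")
    # Leave the assistant turn open so the model generates the reply from here.
    parts.append("<|im_start|>assistant\n")
    return "\n".join(parts)

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Summarize the Dolly dataset in one sentence."},
]
prompt = build_chatml_prompt(messages)
print(prompt)
```

The full prompt must fit within the model's 32768-token context window, including the generated reply.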
