paudelnirajan/general-kd-Qwen2.5-0.5B-Instruct-ber-50000
Text Generation
Concurrency Cost: 1 | Model Size: 0.5B | Quant: BF16 | Ctx Length: 32k | Published: Apr 8, 2026 | Architecture: Transformer
paudelnirajan/general-kd-Qwen2.5-0.5B-Instruct-ber-50000 is a 0.5-billion-parameter instruction-tuned language model based on the Qwen2.5 architecture. Developed by paudelnirajan, it is intended for general-purpose conversational AI tasks. With a context length of 32,768 tokens, it aims to provide efficient and capable language understanding and generation across a range of applications.
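Instruction-tuned models in the Qwen2.5 family consume prompts in the ChatML format. The sketch below builds such a prompt by hand purely for illustration; in practice the chat template bundled with the model's tokenizer (for example via Hugging Face `transformers`' `apply_chat_template`) handles this automatically, and the function name and message contents here are illustrative, not part of this model's published API.

```python
def build_chatml_prompt(messages):
    """Assemble a ChatML-style prompt as used by Qwen2.5-Instruct models.

    Each message is a dict with "role" ("system", "user", or "assistant")
    and "content". The trailing assistant header cues the model to respond.
    """
    parts = []
    for m in messages:
        parts.append(f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>\n")
    parts.append("<|im_start|>assistant\n")
    return "".join(parts)

prompt = build_chatml_prompt([
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Summarize knowledge distillation in one sentence."},
])
print(prompt)
```

When loading the model through `transformers`, the same result is obtained with `tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)`, which reads the template shipped in the repository rather than hard-coding it.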