joekarim/Qwen2.5-Coder-0.5B-Instruct-Gensyn-Swarm-foxy_peckish_pigeon
TEXT GENERATION · Concurrency Cost: 1 · Model Size: 0.5B · Quant: BF16 · Ctx Length: 32k · Published: Nov 21, 2025 · Architecture: Transformer · Status: Cold
The joekarim/Qwen2.5-Coder-0.5B-Instruct-Gensyn-Swarm-foxy_peckish_pigeon model is a 0.5-billion-parameter instruction-tuned language model based on the Qwen2.5 architecture. It is designed for general language tasks and leverages its compact size for efficient deployment, making it suitable for applications that need a small footprint while retaining conversational instruction-following ability. The model supports a context length of 32,768 tokens, allowing it to process moderately long inputs.
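As a minimal illustration of how instruction-following prompts are structured for Qwen2.5-Instruct models, the sketch below renders a conversation in the ChatML-style format these models use. In practice you would call `tokenizer.apply_chat_template` from the Hugging Face `transformers` library with this model's ID; the standalone function here is an assumption-laden approximation for clarity, not the exact template shipped with the model.

```python
def build_chatml_prompt(messages):
    """Render a list of {role, content} dicts in a ChatML-style layout.

    This mirrors the <|im_start|>/<|im_end|> delimiters used by the
    Qwen2.5 family; the real template may differ in small details.
    """
    parts = []
    for msg in messages:
        parts.append(f"<|im_start|>{msg['role']}\n{msg['content']}<|im_end|>")
    # Open an assistant turn so the model knows to generate the reply.
    parts.append("<|im_start|>assistant\n")
    return "\n".join(parts)

messages = [
    {"role": "system", "content": "You are a helpful coding assistant."},
    {"role": "user", "content": "Reverse a string in one line of Python."},
]
prompt = build_chatml_prompt(messages)
print(prompt)
```

Because the model's 32k-token context applies to this rendered prompt plus the generated reply, long conversations or large code files should be budgeted against that limit.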