BabaYaga0001/Qwen2.5-Coder-0.5B-Instruct-Gensyn-Swarm-rabid_flapping_magpie
Text Generation · Concurrency Cost: 1 · Model Size: 0.5B · Quant: BF16 · Ctx Length: 32k · Published: Dec 18, 2025 · Architecture: Transformer · Warm
BabaYaga0001/Qwen2.5-Coder-0.5B-Instruct-Gensyn-Swarm-rabid_flapping_magpie is a 0.5-billion-parameter instruction-tuned model based on the Qwen2.5-Coder architecture. It is designed for code-related tasks, and its compact size makes it cheap to deploy. With a 32,768-token context window, it can handle sizable codebases and multi-step programming instructions. Its primary strength is instruction following in coding applications.
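
Below is a minimal usage sketch with Hugging Face `transformers`, assuming the checkpoint is published on the Hub under the name above and follows the standard Qwen2.5 chat template; the prompt and generation settings are illustrative, not part of the model card.

```python
# Sketch: load the checkpoint and generate a coding response.
# Assumes the model is available on the Hugging Face Hub and uses
# the standard Qwen2.5 chat template.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "BabaYaga0001/Qwen2.5-Coder-0.5B-Instruct-Gensyn-Swarm-rabid_flapping_magpie"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the BF16 quantization listed above
    device_map="auto",
)

# Example prompt; any coding instruction works here.
messages = [
    {"role": "user", "content": "Write a Python function that reverses a string."},
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```

At 0.5B parameters in BF16, the model fits comfortably on a single consumer GPU or even CPU, which is the main appeal of this size class for local code assistance.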