BabaYaga0001/Qwen2.5-Coder-0.5B-Instruct-Gensyn-Swarm-dense_colorful_turkey
Text generation · Concurrency cost: 1 · Model size: 0.5B · Quant: BF16 · Ctx length: 32k · Published: Dec 15, 2025 · Architecture: Transformer · Warm
BabaYaga0001/Qwen2.5-Coder-0.5B-Instruct-Gensyn-Swarm-dense_colorful_turkey is a 0.5-billion-parameter instruction-tuned language model built on the Qwen2.5-Coder architecture and, as its name indicates, produced through a Gensyn Swarm training run. It supports a context length of 32,768 tokens, enough to process fairly long inputs in a single pass. The model targets code-oriented instruction-following tasks, trading peak capability for a compact size that deploys cheaply while retaining broad contextual understanding.
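Since this is an instruction-tuned Qwen2.5-Coder checkpoint, it can in principle be loaded with the standard Hugging Face `transformers` chat workflow. The sketch below is a hypothetical usage example, not taken from the model card: the system prompt, generation settings, and helper names are illustrative assumptions.

```python
# Hypothetical inference sketch for this checkpoint, assuming the standard
# Hugging Face `transformers` chat API used by Qwen2.5 instruct models.

MODEL_ID = "BabaYaga0001/Qwen2.5-Coder-0.5B-Instruct-Gensyn-Swarm-dense_colorful_turkey"


def build_messages(user_prompt: str) -> list[dict]:
    """Wrap a user prompt in the chat-message format that instruction-tuned
    Qwen2.5 checkpoints expect (system turn followed by a user turn)."""
    return [
        # Illustrative system prompt; not specified by the model card.
        {"role": "system", "content": "You are a helpful coding assistant."},
        {"role": "user", "content": user_prompt},
    ]


def generate_reply(user_prompt: str, model_id: str = MODEL_ID) -> str:
    """Download the model and generate one reply. Heavy imports stay inside
    the function so merely importing this module costs nothing."""
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    # BF16 weights for 0.5B parameters are roughly 1 GB, so CPU is viable.
    model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto")

    text = tokenizer.apply_chat_template(
        build_messages(user_prompt),
        tokenize=False,
        add_generation_prompt=True,
    )
    inputs = tokenizer(text, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=256)
    # Strip the prompt tokens and decode only the newly generated reply.
    new_tokens = outputs[0][inputs["input_ids"].shape[-1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)
```

Keeping the imports inside `generate_reply` means the message-formatting helper can be reused (for batching or logging) without pulling in PyTorch.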