BabaYaga0001/Qwen2.5-Coder-0.5B-Instruct-Gensyn-Swarm-aquatic_foxy_flamingo
TEXT GENERATION · Concurrency Cost: 1 · Model Size: 0.5B · Quant: BF16 · Ctx Length: 32k · Published: Dec 12, 2025 · Architecture: Transformer · Warm

BabaYaga0001/Qwen2.5-Coder-0.5B-Instruct-Gensyn-Swarm-aquatic_foxy_flamingo is a 0.5 billion parameter instruction-tuned language model. It belongs to the Qwen2.5-Coder family, combining a compact size with a 32k-token (32,768) context window. While specific training details are not provided, its naming suggests an optimization for coding tasks and integration with a distributed training environment such as Gensyn Swarm. It is intended for applications that need a small, efficient model with a long context window, most likely code generation or analysis.
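Since the card documents no serving API, the sketch below assumes the model loads through the standard Hugging Face transformers interface, as Qwen2.5 instruct models normally do; the system prompt and generation parameters are illustrative, not taken from the card.

```python
MODEL_ID = "BabaYaga0001/Qwen2.5-Coder-0.5B-Instruct-Gensyn-Swarm-aquatic_foxy_flamingo"

def build_messages(user_prompt: str) -> list[dict]:
    """Chat-message layout used by Qwen2.5 instruct models.

    The system prompt here is an illustrative placeholder.
    """
    return [
        {"role": "system", "content": "You are a helpful coding assistant."},
        {"role": "user", "content": user_prompt},
    ]

def generate(prompt: str, max_new_tokens: int = 256) -> str:
    """Download the weights and run one chat-style completion.

    Assumes the repository ships standard transformers artifacts
    (config, tokenizer, chat template), which is not stated on the card.
    """
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    # BF16 matches the quantization listed on the card.
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype="bfloat16")

    text = tokenizer.apply_chat_template(
        build_messages(prompt), tokenize=False, add_generation_prompt=True
    )
    inputs = tokenizer(text, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the echoed prompt.
    return tokenizer.decode(
        outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
    )

if __name__ == "__main__":
    print(generate("Write a Python function that reverses a string."))
```

A 0.5B model in BF16 needs roughly 1 GB of memory for weights, so this runs comfortably on CPU; the long context window is the main reason to prefer it over similarly sized models for whole-file code analysis.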