aysecan10/Qwen2.5-Coder-0.5B-Instruct-Gensyn-Swarm-rabid_grazing_antelope
Text generation · Model size: 0.5B · Quant: BF16 · Context length: 32k · Concurrency cost: 1 · Published: Nov 24, 2025 · Architecture: Transformer

aysecan10/Qwen2.5-Coder-0.5B-Instruct-Gensyn-Swarm-rabid_grazing_antelope is a 0.5-billion-parameter instruction-tuned language model with a 32,768-token (32k) context length. It belongs to the Qwen2.5-Coder family, which is optimized for code-related tasks. Its compact size combined with a large context window makes it well suited to processing substantial codebases or long programming dialogues efficiently.
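A minimal usage sketch with the Hugging Face `transformers` library is shown below. It assumes the model is available under the repo id above and that `transformers` (with a PyTorch backend) is installed; the prompt content and generation settings are illustrative, not part of the model card.

```python
# Repo id from this model card (assumed to be resolvable on the Hugging Face Hub).
MODEL_ID = "aysecan10/Qwen2.5-Coder-0.5B-Instruct-Gensyn-Swarm-rabid_grazing_antelope"

# Qwen2.5 instruct models use the standard chat-message format.
MESSAGES = [
    {"role": "system", "content": "You are a helpful coding assistant."},
    {"role": "user", "content": "Write a Python function that reverses a string."},
]


def generate(messages, max_new_tokens=256):
    """Load the model and run one chat turn.

    Heavy imports are kept inside the function so the module can be
    inspected without downloading the model weights.
    """
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype="auto")

    # Render the chat messages with the model's built-in chat template.
    prompt = tokenizer.apply_chat_template(
        messages, tokenize=False, add_generation_prompt=True
    )
    inputs = tokenizer(prompt, return_tensors="pt")
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens)

    # Strip the prompt tokens; decode only the newly generated completion.
    completion = output_ids[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(completion, skip_special_tokens=True)


if __name__ == "__main__":
    print(generate(MESSAGES))
```

Because the model is only 0.5B parameters in BF16, it fits comfortably on a single consumer GPU or even CPU for short generations.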
