wacicu/Qwen2.5-0.5B-Instruct-Gensyn-Swarm-flightless_bristly_falcon
TEXT GENERATION · Concurrency Cost: 1 · Model Size: 0.5B · Quant: BF16 · Ctx Length: 32k · Published: Sep 5, 2025 · Architecture: Transformer

wacicu/Qwen2.5-0.5B-Instruct-Gensyn-Swarm-flightless_bristly_falcon is a 0.5-billion-parameter instruction-tuned causal language model based on the Qwen2.5 architecture. It is shared by wacicu as part of the Gensyn Swarm initiative and supports a 32,768-token context window. Its primary differentiator is this combination of compact size and long context, making it suitable for applications that must process lengthy inputs on limited computational resources.
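Since the model follows the standard Qwen2.5 instruct format, it can be loaded with the Hugging Face `transformers` library. The sketch below is a minimal example, assuming `transformers` and `torch` are installed and that the weights can be downloaded from the Hub; the prompt text and generation settings are illustrative only.

```python
# Minimal sketch: load the checkpoint and run one chat turn.
# Assumes `transformers` and `torch` are installed and network
# access is available for the first download.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "wacicu/Qwen2.5-0.5B-Instruct-Gensyn-Swarm-flightless_bristly_falcon"


def generate_reply(prompt: str, max_new_tokens: int = 128) -> str:
    """Generate a single assistant reply using the chat template."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    # BF16 matches the quantization listed on this model card.
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype="bfloat16")

    messages = [{"role": "user", "content": prompt}]
    text = tokenizer.apply_chat_template(
        messages, tokenize=False, add_generation_prompt=True
    )
    inputs = tokenizer(text, return_tensors="pt")
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens)

    # Decode only the newly generated tokens, not the prompt.
    new_tokens = output_ids[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)


if __name__ == "__main__":
    print(generate_reply("Summarize long-context language models in one sentence."))
```

Because the model is only 0.5B parameters, it can run in BF16 on a single consumer GPU or even on CPU, though very long inputs near the 32k context limit will still be memory-intensive.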
