Avtertu/Qwen2.5-0.5B-Instruct-Gensyn-Swarm-silent_skittish_ape
TEXT GENERATION | Concurrency Cost: 1 | Model Size: 0.5B | Quant: BF16 | Ctx Length: 32k | Published: Aug 3, 2025 | Architecture: Transformer | Warm

Avtertu/Qwen2.5-0.5B-Instruct-Gensyn-Swarm-silent_skittish_ape is a 0.5-billion-parameter instruction-tuned language model based on the Qwen2.5 architecture. With a context length of 32,768 tokens, it can process long inputs and generate coherent responses while remaining compact. This makes it suitable for applications that need a small but capable model for instruction-following tasks over extended contexts.
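As an instruct model, it expects prompts in the ChatML-style conversation format used by Qwen2.5. The sketch below shows how such a prompt is assembled; in practice the tokenizer's `apply_chat_template` method handles this, and the exact template is assumed here for illustration only.

```python
# Minimal sketch of a ChatML-style prompt, assuming the standard
# Qwen2.5 instruct template. For real inference, prefer
# tokenizer.apply_chat_template from the transformers library.

def build_chatml_prompt(system: str, user: str) -> str:
    """Assemble a single-turn ChatML prompt string."""
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        f"<|im_start|>assistant\n"  # model continues from here
    )

prompt = build_chatml_prompt(
    "You are a helpful assistant.",
    "Summarize the Qwen2.5 architecture in one sentence.",
)
print(prompt)
```

The trailing `<|im_start|>assistant` turn is left open so the model's generation fills in the reply.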
