Historya/Qwen2.5-0.5B-Instruct-Gensyn-Swarm-territorial_mangy_ox
TEXT GENERATION · Concurrency Cost: 1 · Model Size: 0.5B · Quant: BF16 · Ctx Length: 32k · Published: Aug 31, 2025 · Architecture: Transformer
Historya/Qwen2.5-0.5B-Instruct-Gensyn-Swarm-territorial_mangy_ox is a 0.5 billion parameter instruction-tuned language model based on the Qwen2.5 architecture. It is part of a series of models, though its current documentation does not describe what differentiates it from the others or name a primary use case. With a context length of 32k (32,768) tokens, it is suited to general language understanding and generation tasks; any model-specific optimizations are likewise not documented.
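Since the card does not include a usage snippet, here is a minimal inference sketch using the Hugging Face `transformers` library, assuming the repository loads like a standard Qwen2.5 instruct checkpoint with a chat template:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed repository id, taken from the model card title.
model_id = "Historya/Qwen2.5-0.5B-Instruct-Gensyn-Swarm-territorial_mangy_ox"

tokenizer = AutoTokenizer.from_pretrained(model_id)
# torch_dtype="auto" should pick up the BF16 weights listed on the card.
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto")

# Instruct models expect chat-formatted input; the tokenizer's chat
# template inserts the special tokens for us.
messages = [{"role": "user", "content": "Briefly explain what a language model is."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
)

output_ids = model.generate(inputs, max_new_tokens=128)
# Decode only the newly generated tokens, skipping the prompt.
response = tokenizer.decode(
    output_ids[0][inputs.shape[-1]:], skip_special_tokens=True
)
print(response)
```

At 0.5B parameters the model runs comfortably on CPU or a small GPU; the prompt and parameter names above are illustrative, not taken from the card.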