skyskyyin2/Qwen2.5-0.5B-Instruct-Gensyn-Swarm-mute_dextrous_newt
Text generation · Concurrency cost: 1 · Model size: 0.5B · Quantization: BF16 · Context length: 32k · Published: Aug 13, 2025 · Architecture: Transformer · Status: Warm

The skyskyyin2/Qwen2.5-0.5B-Instruct-Gensyn-Swarm-mute_dextrous_newt model is a 0.5-billion-parameter instruction-tuned language model based on the Qwen2.5 architecture. It is designed for general-purpose conversational AI tasks, and its compact size makes it efficient to deploy. With a context length of 32,768 tokens, it can handle applications that require long-context understanding and generation.
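Qwen2.5-Instruct models converse in the ChatML format. The sketch below builds a ChatML prompt by hand to show the layout; the template shown is the standard Qwen2.5 ChatML structure and is an assumption here — in practice you would call `tokenizer.apply_chat_template(...)` from the `transformers` library, which reads the authoritative template from the model's tokenizer config.

```python
# Sketch: hand-rendering a ChatML prompt as used by Qwen2.5-Instruct models.
# The special tokens below (<|im_start|>, <|im_end|>) follow the standard
# Qwen2.5 chat template; verify them against the model's tokenizer config.

def build_chatml_prompt(messages):
    """Render a list of {role, content} dicts into a ChatML prompt string."""
    parts = []
    for m in messages:
        parts.append(f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>\n")
    # A trailing assistant header cues the model to begin its reply.
    parts.append("<|im_start|>assistant\n")
    return "".join(parts)

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Summarize Qwen2.5 in one sentence."},
]
prompt = build_chatml_prompt(messages)
print(prompt)
```

With the `transformers` library installed, the same messages list can be passed directly to `tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)`, which produces an equivalent prompt without hard-coding the template.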
