uniswap/Qwen2.5-Coder-0.5B-Instruct-Gensyn-Swarm-large_trotting_baboon
Text generation · Concurrency cost: 1 · Model size: 0.5B · Quantization: BF16 · Context length: 32k · Published: Nov 13, 2025 · Architecture: Transformer

The uniswap/Qwen2.5-Coder-0.5B-Instruct-Gensyn-Swarm-large_trotting_baboon model is a 0.5-billion-parameter instruction-tuned causal language model based on the Qwen2.5 architecture. Its compact size makes it efficient to deploy, and it is intended for direct use in applications that need a smaller yet capable language model. Its primary strength is instruction following across a range of natural language processing tasks.
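As a sketch of how this model could be used, the snippet below loads it with the Hugging Face `transformers` library and runs chat-style generation. This assumes the model follows the standard Qwen2.5-Instruct chat template and that `transformers` and `torch` are installed; the system prompt and generation parameters are illustrative, not part of the model card.

```python
MODEL_ID = "uniswap/Qwen2.5-Coder-0.5B-Instruct-Gensyn-Swarm-large_trotting_baboon"


def build_messages(user_prompt, system_prompt="You are a helpful assistant."):
    """Build a chat-style message list in the format expected by
    tokenizer.apply_chat_template (role/content dicts)."""
    return [
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": user_prompt},
    ]


def generate(user_prompt, max_new_tokens=256):
    """Download the model and generate a reply. Imports are local so the
    rest of the module works without transformers/torch installed."""
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    # BF16 matches the quantization listed on the model card.
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype="bfloat16")

    inputs = tokenizer.apply_chat_template(
        build_messages(user_prompt),
        add_generation_prompt=True,
        return_tensors="pt",
    )
    outputs = model.generate(inputs, max_new_tokens=max_new_tokens)
    # Strip the prompt tokens, keep only the newly generated reply.
    return tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True)
```

Calling `generate("Write a function that reverses a string.")` downloads the weights on first use (roughly 1 GB at BF16 for a 0.5B model) and returns the decoded reply.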
