Wvidit/Synnapse-Qwen2.5-3B
Text generation · Concurrency cost: 1 · Model size: 3.1B · Quantization: BF16 · Context length: 32k · Published: Apr 2, 2026 · Architecture: Transformer · Status: Cold
Wvidit/Synnapse-Qwen2.5-3B is a 3.1 billion parameter language model based on the Qwen2.5 architecture, shared by Wvidit and intended for general language understanding and generation tasks. Its compact size and 32768-token context length make it suitable for applications that need efficient processing with moderate context recall.
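A minimal usage sketch for text generation with this model, assuming the weights are available via the Hugging Face `transformers` library under the repo id shown on this card (the hosting location, prompt, and generation parameters are illustrative assumptions, not details from the card):

```python
# Hedged sketch: generating text with Wvidit/Synnapse-Qwen2.5-3B.
# Assumes the checkpoint is loadable with Hugging Face transformers;
# the card only names the repo, so this is an illustration, not a
# confirmed API for this specific model.

MODEL_ID = "Wvidit/Synnapse-Qwen2.5-3B"


def generate(prompt: str, max_new_tokens: int = 128) -> str:
    # Imports are local so the module can be inspected without
    # transformers/torch installed.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype=torch.bfloat16,  # card lists BF16 weights
    )
    inputs = tokenizer(prompt, return_tensors="pt")
    # 32k context per the card; keep generation well inside that budget.
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)


if __name__ == "__main__":
    print(generate("Explain context length in one sentence."))
```

At 3.1B parameters in BF16 the weights occupy roughly 6 GB, so the model fits comfortably on a single consumer GPU or can run (slowly) on CPU.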