URajinda/Qwen2.5-MM-1.5B-v1.0
Text generation · Concurrency cost: 1 · Model size: 1.5B · Quant: BF16 · Context length: 32k · Published: Dec 27, 2025 · Architecture: Transformer · Status: Warm

URajinda/Qwen2.5-MM-1.5B-v1.0 is a 1.5-billion-parameter model based on the Qwen2.5 architecture, with a context length of 131,072 tokens. It targets general language understanding and generation, and its large context window makes it suited to tasks that involve processing long input sequences.
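A minimal usage sketch with the Hugging Face `transformers` library, assuming the checkpoint is published on the Hub under this model ID and that `transformers` and `torch` are installed; the prompt and generation parameters are illustrative, not prescribed by the model card:

```python
# Sketch: load URajinda/Qwen2.5-MM-1.5B-v1.0 and generate text.
# Assumes the checkpoint is available on the Hugging Face Hub and that
# transformers + torch are installed; downloading it requires network access.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "URajinda/Qwen2.5-MM-1.5B-v1.0"


def generate(prompt: str, max_new_tokens: int = 256) -> str:
    """Generate a completion for `prompt` using greedy decoding defaults."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype="bfloat16",  # matches the BF16 quantization listed above
        device_map="auto",
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Strip the prompt tokens so only the newly generated text is returned.
    new_tokens = outputs[0][inputs["input_ids"].shape[-1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)


if __name__ == "__main__":
    print(generate("Summarize the key ideas of transformer attention."))
```

The BF16 dtype mirrors the quantization shown in the card's metadata; for constrained hardware, a smaller dtype or an 8-bit load could be substituted.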
