rashadaziz/Qwen2.5-7B-MLC
Text generation · Concurrency cost: 1 · Model size: 7.6B · Quant: FP8 · Context length: 32k · Published: Feb 17, 2026 · License: apache-2.0 · Architecture: Transformer · Open weights · Cold

rashadaziz/Qwen2.5-7B-MLC is a 7.6-billion-parameter causal language model fine-tuned from Qwen/Qwen2.5-7B-Instruct. It specializes in safety and alignment, having been trained on the dpo-pku-saferlhf-alpaca3-8b-multilin dataset, and is intended for applications that require robust, responsible responses. Its 32,768-token context window supports long, complex interactions.
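The "MLC" suffix suggests the weights are packaged for the MLC LLM runtime. A minimal sketch of local inference with the MLC LLM Python API follows; it assumes `mlc_llm` is installed and that the weights resolve under the `HF://` scheme — only the model identifier comes from this card, the rest is illustrative.

```python
# Hedged sketch: chat with rashadaziz/Qwen2.5-7B-MLC via the MLC LLM Python
# API. Assumes `mlc_llm` is installed and the model can be fetched from the
# Hugging Face Hub; the prompt and streaming loop are illustrative only.


def build_messages(prompt: str) -> list[dict]:
    """Wrap a user prompt in the OpenAI-style chat format MLC LLM expects."""
    return [{"role": "user", "content": prompt}]


def main() -> None:
    # Deferred import so the pure helper above stays usable without mlc_llm.
    from mlc_llm import MLCEngine

    engine = MLCEngine("HF://rashadaziz/Qwen2.5-7B-MLC")
    # Stream tokens from an OpenAI-compatible chat-completions call.
    for chunk in engine.chat.completions.create(
        messages=build_messages("Summarize safe RLHF in one sentence."),
        stream=True,
    ):
        for choice in chunk.choices:
            print(choice.delta.content or "", end="", flush=True)
    print()
    engine.terminate()


if __name__ == "__main__":
    main()
```

The chat-completions call mirrors the OpenAI client shape, so swapping in a different MLC-packaged model is just a matter of changing the identifier string.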
