b61414/Sadim-7B-v1
Text Generation · Concurrency Cost: 1 · Model Size: 7.6B · Quant: FP8 · Ctx Length: 32K · Published: Apr 3, 2026 · Architecture: Transformer · Status: Cold

b61414/Sadim-7B-v1 is a 7.6-billion-parameter language model created by linearly merging Qwen2.5-7B-Instruct and Qwen2.5-Coder-7B-Instruct. The merge combines general instruction-following with enhanced code generation and understanding, making the model suitable for applications that need both conversational interaction and robust programming assistance. It supports a 32K-token context window.
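The exact merge recipe is not published. Linear merges of this kind are commonly produced with mergekit, where a config might look like the following sketch; the equal weights and dtype here are assumptions, not the model author's settings:

```yaml
# Hypothetical mergekit config for a linear merge of the two source models.
# Weight values are illustrative; the actual mix used for Sadim-7B-v1 is unknown.
models:
  - model: Qwen/Qwen2.5-7B-Instruct
    parameters:
      weight: 0.5
  - model: Qwen/Qwen2.5-Coder-7B-Instruct
    parameters:
      weight: 0.5
merge_method: linear
dtype: bfloat16
```

With mergekit installed, such a config is applied with `mergekit-yaml config.yml ./output-model`; a linear merge simply averages the two models' weights tensor-by-tensor according to the given coefficients.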
