thrnn/qwen2.5-1.5b-sft-resta
TEXT GENERATION
Concurrency Cost: 1 · Model Size: 1.5B · Quant: BF16 · Ctx Length: 32k · Published: Apr 2, 2026 · Architecture: Transformer

thrnn/qwen2.5-1.5b-sft-resta is a 1.5-billion-parameter language model based on the Qwen2.5 architecture, created by thrnn via a Task Arithmetic merge. The merge combines Qwen/Qwen2.5-1.5B-Instruct with a harmful LoRA model, using thrnn/qwen2.5-1.5b-medical-sft-lora as the base. It is intended for applications that require a modified response profile relative to its parent models, and it supports a 32,768-token context length.
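Task Arithmetic merging works by treating each fine-tuned model as a "task vector" (its parameter delta from a shared base) and adding weighted deltas onto the base. The sketch below illustrates the general technique on plain Python dicts; the function name, the toy parameters, and the weights are illustrative assumptions, not the actual recipe used for this model.

```python
def task_arithmetic_merge(base, finetuned, weights):
    """Merge models by adding weighted task vectors onto a shared base.

    base: dict mapping parameter name -> value (the base model's weights)
    finetuned: list of dicts, each fine-tuned from the same base
    weights: per-model scaling coefficients (can be negative to subtract
             a behavior, e.g. a harmful direction)
    """
    merged = dict(base)
    for state, w in zip(finetuned, weights):
        for name, value in state.items():
            # Task vector for this parameter: fine-tuned minus base
            merged[name] += w * (value - base[name])
    return merged

# Toy demonstration with single-scalar "models"
base = {"w": 1.0}
ft_a = {"w": 3.0}   # task vector: +2.0
ft_b = {"w": 0.0}   # task vector: -1.0
merged = task_arithmetic_merge(base, [ft_a, ft_b], weights=[0.5, 1.0])
print(merged["w"])  # 1.0 + 0.5*2.0 + 1.0*(-1.0) = 1.0
```

In a real merge the dicts would be full state dicts of tensors, but the arithmetic is the same per parameter; a negative weight on a harmful model's task vector is one way such merges steer the response profile.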
