Sandeep0079/model_sft_resta
Text Generation · Concurrency Cost: 1 · Model Size: 1.5B · Quant: BF16 · Ctx Length: 32k · Published: Apr 5, 2026 · Architecture: Transformer

Sandeep0079/model_sft_resta is a 1.5-billion-parameter language model created by merging Qwen/Qwen2.5-1.5B-Instruct with two other local models using the Linear merge method. The merge is configured to integrate and balance characteristics of the base instruction-tuned model with additional fine-tuned components. It is intended for applications that need a blend of general instruction following and potentially modified behavioral responses, and supports a 32,768-token context length.
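The card does not publish the merge configuration. A merge like this is commonly expressed as a mergekit-style YAML recipe; the fragment below is a hypothetical sketch only, with placeholder paths and weights standing in for the two unnamed local models:

```yaml
# Hypothetical mergekit-style config; model paths and weights are placeholders.
merge_method: linear
models:
  - model: Qwen/Qwen2.5-1.5B-Instruct
    parameters:
      weight: 0.5
  - model: ./local-finetune-a        # placeholder for the first local model
    parameters:
      weight: 0.25
  - model: ./local-finetune-b        # placeholder for the second local model
    parameters:
      weight: 0.25
dtype: bfloat16
```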
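Conceptually, a Linear merge computes a weighted average of corresponding parameter tensors across the source models. The sketch below illustrates the arithmetic with toy NumPy arrays; the function name, the per-model weights, and the tiny "models" are illustrative, not the actual recipe used for this model:

```python
import numpy as np

def linear_merge(param_sets, weights):
    """Weighted average of corresponding parameter tensors.

    param_sets: list of dicts mapping parameter name -> np.ndarray,
                one dict per source model (same names/shapes across models).
    weights:    per-model mixing coefficients, normalized to sum to 1.
    """
    w = np.asarray(weights, dtype=np.float64)
    w = w / w.sum()  # normalize so the merge is a convex combination
    merged = {}
    for name in param_sets[0]:
        merged[name] = sum(wi * p[name] for wi, p in zip(w, param_sets))
    return merged

# Toy example: two "models", each consisting of a single 2x2 weight matrix.
base = {"layer.weight": np.array([[1.0, 2.0], [3.0, 4.0]])}
tuned = {"layer.weight": np.array([[3.0, 4.0], [5.0, 6.0]])}
merged = linear_merge([base, tuned], weights=[0.5, 0.5])
print(merged["layer.weight"])  # element-wise midpoint of the two matrices
```

With equal weights of 0.5 each, every merged parameter is the element-wise midpoint of the two source tensors; unequal weights bias the result toward one model.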
