felixwangg/Qwen2.5-Coder-7B-Instruct-pyvul-document-scaling_coef-0.3
Text Generation · Concurrency Cost: 1 · Model Size: 7.6B · Quant: FP8 · Ctx Length: 32k · Published: Feb 19, 2026 · Architecture: Transformer

The felixwangg/Qwen2.5-Coder-7B-Instruct-pyvul-document-scaling_coef-0.3 model is a 7.6 billion parameter language model derived from the Qwen2.5-Coder-7B-Instruct base model. It was created by felixwangg using additive task-vector combination, applying a scaling coefficient of 0.3 to integrate behaviors from fine-tuned models into the base weights. By combining characteristics from multiple fine-tuned models, it targets specialized applications where merged task-specific knowledge is beneficial.
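The merging recipe above can be sketched in a few lines. This is a minimal illustration of additive task-vector combination (task arithmetic), not the actual merge script: a task vector is the element-wise difference between a fine-tuned checkpoint and its base, and the merged model adds the scaled sum of task vectors back onto the base. The weight dicts and parameter names below are illustrative placeholders.

```python
import numpy as np

SCALING_COEF = 0.3  # scaling coefficient reported in the model card


def task_vector(finetuned: dict, base: dict) -> dict:
    """Task vector = fine-tuned weights minus base weights, per parameter."""
    return {name: finetuned[name] - base[name] for name in base}


def apply_task_vectors(base: dict, task_vectors: list, coef: float) -> dict:
    """Return base weights plus coef * (sum of task vectors)."""
    merged = {name: w.copy() for name, w in base.items()}
    for tv in task_vectors:
        for name in merged:
            merged[name] += coef * tv[name]
    return merged


# Toy example with a single hypothetical parameter tensor "w"
base_weights = {"w": np.array([1.0, 2.0])}
finetuned_weights = {"w": np.array([2.0, 4.0])}

tv = task_vector(finetuned_weights, base_weights)
merged = apply_task_vectors(base_weights, [tv], SCALING_COEF)
# merged["w"] = base + 0.3 * (finetuned - base) = [1.3, 2.6]
```

A coefficient below 1.0 (here 0.3) dampens how strongly the fine-tuned behavior is imprinted on the base model, which helps preserve the base model's general capabilities when several task vectors are combined.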
