TabCanNotTab/SALV-Qwen2.5-Coder-7B-Instruct
Task: Text generation · Concurrency cost: 1 · Model size: 7.6B · Quantization: FP8 · Context length: 32k · Published: Oct 21, 2025 · Architecture: Transformer

TabCanNotTab/SALV-Qwen2.5-Coder-7B-Instruct is a 7.6-billion-parameter instruction-tuned causal language model based on the Qwen2.5 architecture. It is fine-tuned for code generation and produces well-structured code, including hardware description tasks such as implementing Verilog modules. The underlying architecture supports context lengths of up to 131,072 tokens (this deployment lists a 32k context window), which lets it handle complex coding tasks and long, detailed programming instructions.
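As an instruction-tuned chat model, it is typically driven through a chat template. The sketch below shows one plausible way to query it via the Hugging Face `transformers` library; the system prompt, generation settings, and the assumption that the repo ships a standard chat template are illustrative, not confirmed by this card.

```python
MODEL_ID = "TabCanNotTab/SALV-Qwen2.5-Coder-7B-Instruct"

def build_messages(instruction: str) -> list[dict]:
    """Wrap a coding instruction in the role-based chat format that
    instruction-tuned Qwen2.5 models expect."""
    return [
        # System prompt is an assumption; adjust to the model's documented one.
        {"role": "system", "content": "You are a helpful coding assistant."},
        {"role": "user", "content": instruction},
    ]

def generate(instruction: str, max_new_tokens: int = 512) -> str:
    """Load the model and generate a completion.
    Downloads the full 7.6B checkpoint on first run."""
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")
    # Render the chat messages into the model's prompt format.
    prompt = tokenizer.apply_chat_template(
        build_messages(instruction), tokenize=False, add_generation_prompt=True
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    out = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the echoed prompt.
    return tokenizer.decode(
        out[0][inputs.input_ids.shape[1]:], skip_special_tokens=True
    )
```

A call such as `generate("Write a Verilog module for a 4-bit up counter with synchronous reset.")` exercises the Verilog capability the card highlights; the model import is deferred inside `generate` so the prompt-building helper can be used without loading the weights.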
