rawcell/Qwen2.5-Coder-7B-Instruct-bruno
Text Generation · Concurrency Cost: 1 · Model Size: 7.6B · Quant: FP8 · Ctx Length: 32k · Published: Feb 1, 2026 · Architecture: Transformer · Status: Cold

rawcell/Qwen2.5-Coder-7B-Instruct-bruno is a 7.6-billion-parameter causal language model derived from Qwen2.5-Coder-7B-Instruct, published by rawcell. It retains the base model's 32,768-token context length and has been modified to reduce refusals, making it suitable for code generation and conversational tasks with fewer content restrictions.
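Since this is an instruct-tuned conversational model, prompts should follow the chat format of the Qwen2.5 family, which uses ChatML-style turn markers. The sketch below shows how such a prompt is assembled by hand for illustration; in practice you would load the model's tokenizer and call `apply_chat_template()` instead, and the exact special tokens should be confirmed against the tokenizer config.

```python
# Minimal sketch of a ChatML-style chat prompt, as used by the Qwen2.5
# family. This is an illustration only; the tokenizer's own chat template
# is the authoritative source for the exact format.

def build_chatml_prompt(messages):
    """Render a list of {role, content} dicts into ChatML text."""
    parts = []
    for m in messages:
        parts.append(f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>\n")
    # Leave the assistant turn open so the model continues from here.
    parts.append("<|im_start|>assistant\n")
    return "".join(parts)

prompt = build_chatml_prompt([
    {"role": "system", "content": "You are a helpful coding assistant."},
    {"role": "user", "content": "Reverse a string in Python."},
])
print(prompt)
```

The open `<|im_start|>assistant` turn at the end is what cues the model to generate its reply rather than continue the user's message.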
