bknyaz/Qwen3-0.6B-Fr
Text Generation · Concurrency Cost: 1 · Model Size: 0.8B · Quant: BF16 · Context Length: 32k · Published: Jan 30, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights

bknyaz/Qwen3-0.6B-Fr is a 0.8-billion-parameter causal language model, fine-tuned from Qwen/Qwen3-0.6B, with a context length of 32,768 tokens. The model is optimized for French language understanding and generation, and shows improved performance on French benchmarks relative to its base model. It is intended for conversational AI and instruction-following tasks, particularly in French-language contexts.
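A minimal usage sketch with the `transformers` library, assuming the checkpoint is hosted on the Hugging Face Hub under the id above and ships a chat template (standard for Qwen3-family models); the French prompt is illustrative only:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "bknyaz/Qwen3-0.6B-Fr"

# Load tokenizer and model weights (assumes network access to the Hub).
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Build a single-turn chat prompt via the model's chat template.
messages = [
    {"role": "user", "content": "Explique la photosynthèse en une phrase."}
]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
)

# Generate a short completion and decode only the newly produced tokens.
output_ids = model.generate(input_ids, max_new_tokens=64)
reply = tokenizer.decode(
    output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True
)
print(reply)
```

Generation parameters such as `max_new_tokens`, temperature, and sampling settings should be tuned to the target use case; the defaults above are placeholders.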
