yil384/Qwen3-0.6B-full
Text generation · Concurrency cost: 1 · Model size: 0.8B · Quantization: BF16 · Context length: 32k · Published: Mar 8, 2026 · License: other · Architecture: Transformer

yil384/Qwen3-0.6B-full is a 0.8-billion-parameter language model fine-tuned from Qwen/Qwen3-0.6B, with a context length of 32,768 tokens. It was trained on the codev_r1_sft_python_passed_sharegpt_skeleton_balanced dataset, a supervised fine-tuning set oriented toward Python code, and is intended for applications involving Python code generation, completion, or understanding.
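Since this is a fine-tune of a Qwen3 chat model, prompts follow the ChatML-style template used by the Qwen family. Below is a minimal, self-contained sketch of assembling such a prompt by hand; in practice, `tokenizer.apply_chat_template` from the `transformers` library produces this format for you, and the example request is purely illustrative:

```python
# Build a ChatML-style prompt for a Python code-generation request.
# Qwen-family chat models wrap each turn in <|im_start|>role ... <|im_end|>
# markers; transformers' tokenizer.apply_chat_template emits the same layout.

def build_chatml_prompt(messages):
    """Render a list of {role, content} dicts into a ChatML prompt string."""
    parts = []
    for msg in messages:
        parts.append(f"<|im_start|>{msg['role']}\n{msg['content']}<|im_end|>")
    # A trailing assistant header tells the model where to begin generating.
    parts.append("<|im_start|>assistant\n")
    return "\n".join(parts)

prompt = build_chatml_prompt([
    {"role": "user",
     "content": "Write a Python function that reverses a string."},
])
print(prompt)
```

The resulting string would be tokenized and passed to the model's generate call; using the tokenizer's built-in chat template is preferable in real code, since it stays in sync with the model's training format.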
