ryzzlestrizzle/qwen3-8B-ZH-SynthDolly-1A
Task: Text Generation
Concurrency Cost: 1
Model Size: 8B
Quant: FP8
Ctx Length: 32k
Published: Mar 26, 2026
License: apache-2.0
Architecture: Transformer
Open Weights · Cold

ryzzlestrizzle/qwen3-8B-ZH-SynthDolly-1A is an 8-billion-parameter Qwen3 model developed by ryzzlestrizzle and fine-tuned from unsloth/qwen3-8B. It was trained with Unsloth and Hugging Face's TRL library, which the author reports yielded 2x faster training. The model is designed for general language tasks, leveraging the Qwen3 architecture and a 32,768-token context length.
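A minimal usage sketch with Hugging Face `transformers` might look like the following. The model ID comes from this card; everything else (the ChatML-style prompt format common to Qwen models, the helper names `build_chat` and `generate_reply`, and the generation parameters) is an illustrative assumption, not documented behavior of this fine-tune.

```python
def build_chat(messages):
    """Format a list of {"role", "content"} dicts in the ChatML style used by
    Qwen models. This is a manual sketch; in practice, prefer the tokenizer's
    apply_chat_template, which applies the model's own template."""
    out = ""
    for m in messages:
        out += f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>\n"
    # Leave the prompt open at the assistant turn so the model completes it.
    out += "<|im_start|>assistant\n"
    return out


def generate_reply(messages, max_new_tokens=128):
    """Hypothetical end-to-end call: downloads the 8B model, so it needs
    transformers installed and enough GPU/CPU memory for the weights."""
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "ryzzlestrizzle/qwen3-8B-ZH-SynthDolly-1A"
    tok = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

    prompt = build_chat(messages)
    inputs = tok(prompt, return_tensors="pt").to(model.device)
    out = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, skipping the prompt.
    return tok.decode(out[0][inputs["input_ids"].shape[-1]:],
                      skip_special_tokens=True)
```

For example, `generate_reply([{"role": "user", "content": "你好!"}])` would return the model's completion of a single-turn chat.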
