kairawal/Qwen3-8B-PT-SynthDolly-1A
Task: Text generation
Model size: 8B parameters
Quantization: FP8
Context length: 32K
Published: Mar 26, 2026
License: apache-2.0
Architecture: Transformer (open weights)
Concurrency cost: 1

kairawal/Qwen3-8B-PT-SynthDolly-1A is an 8-billion-parameter Qwen3-based language model developed by kairawal, with a 32K-token context length. It was fine-tuned using Unsloth and Hugging Face's TRL library, with an emphasis on efficient training. The model is intended for general language tasks, leveraging the Qwen3 architecture for robust performance.
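A minimal usage sketch, assuming the checkpoint is hosted on the Hugging Face Hub and loadable through the standard `transformers` text-generation API (the prompt and generation settings below are illustrative, not from the model card):

```python
# Hypothetical usage sketch for kairawal/Qwen3-8B-PT-SynthDolly-1A.
# Assumes the checkpoint is available on the Hugging Face Hub and is
# compatible with the standard transformers causal-LM interface.

MODEL_ID = "kairawal/Qwen3-8B-PT-SynthDolly-1A"
CONTEXT_LENGTH = 32 * 1024  # 32K-token context window, per the model card


def generate_reply(prompt: str, max_new_tokens: int = 256) -> str:
    """Load the model and generate a reply to a single user prompt.

    Imports are deferred so this module can be inspected without
    downloading the 8B-parameter checkpoint.
    """
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype="auto",  # keep the stored (FP8-quantized) precision
        device_map="auto",   # place weights on GPU when one is available
    )

    # Format the prompt with the model's own chat template.
    messages = [{"role": "user", "content": prompt}]
    inputs = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)

    output = model.generate(inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the echoed prompt.
    return tokenizer.decode(
        output[0][inputs.shape[-1]:], skip_special_tokens=True
    )
```

Calling `generate_reply("Summarize what a context window is.")` would download the weights on first use; deferring the heavy imports keeps the module cheap to import.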