RJTPP/scot0402s-qwen3-32b-full
Task: Text generation
Model size: 32B
Quantization: FP8
Context length: 32k
Concurrency cost: 2
Architecture: Transformer
Weights: Open (apache-2.0)
Published: Apr 8, 2026

RJTPP/scot0402s-qwen3-32b-full is a 32-billion-parameter Qwen3-based causal language model developed by RJTPP. It was finetuned with Unsloth and Hugging Face's TRL library, which made the training process roughly 2x faster. The model is designed for general language generation tasks, leveraging its large parameter count and efficient training methodology.
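As a rough usage sketch, the model can be loaded with the standard Hugging Face Transformers API for causal language models. The repository ID comes from this card; the generation parameters and prompt are illustrative assumptions, and loading a 32B FP8 checkpoint requires correspondingly large GPU memory.

```python
MODEL_ID = "RJTPP/scot0402s-qwen3-32b-full"  # repo ID from this model card


def generate(prompt: str, max_new_tokens: int = 128) -> str:
    """Generate a completion with the model (illustrative sketch).

    Imports are done lazily so this module can be inspected without
    transformers/torch installed. Actual inference needs GPU memory
    sufficient for a 32B-parameter model.
    """
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype="auto",   # keep the checkpoint's native precision
        device_map="auto",    # spread layers across available devices
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)


if __name__ == "__main__":
    print(generate("Summarize the transformer architecture in one sentence."))
```

The lazy imports keep the module importable on machines without the heavyweight dependencies; the actual download and load only happen when `generate` is called.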
