ChuGyouk/R5_1
Task: Text Generation
Concurrency Cost: 1
Model Size: 8B
Quantization: FP8
Context Length: 32k
Published: Mar 27, 2026
Architecture: Transformer

ChuGyouk/R5_1 is an 8-billion-parameter language model fine-tuned from ChuGyouk/Qwen3-8B-Base using supervised fine-tuning (SFT) with the TRL framework. It is designed for general text generation tasks, offering a balance of performance and efficiency across a range of applications, and supports a context length of 32,768 tokens.
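A minimal usage sketch is shown below, assuming the model is published on the Hugging Face Hub under the repo id "ChuGyouk/R5_1" and loads with the standard transformers causal-LM classes; the prompt and generation settings are illustrative only.

```python
# Minimal sketch: load the model and generate text with transformers.
# Assumptions: the Hub repo id is "ChuGyouk/R5_1" and the weights are
# compatible with AutoModelForCausalLM (device_map="auto" needs accelerate).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ChuGyouk/R5_1"  # assumed Hub repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",
    device_map="auto",
)

prompt = "Explain the difference between pretraining and supervised fine-tuning."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# Generate up to 256 new tokens; the model accepts prompts up to 32,768 tokens.
outputs = model.generate(**inputs, max_new_tokens=256)

# Decode only the newly generated tokens, skipping the echoed prompt.
generated = outputs[0][inputs["input_ids"].shape[-1]:]
print(tokenizer.decode(generated, skip_special_tokens=True))
```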
