ChuGyouk/R17
Task: Text Generation
Concurrency Cost: 1
Model Size: 8B
Quantization: FP8
Context Length: 32k
Published: Mar 27, 2026
Architecture: Transformer

ChuGyouk/R17 is an 8-billion-parameter language model fine-tuned from ChuGyouk/Qwen3-8B-Base, with a 32,768-token context length. It was trained with Supervised Fine-Tuning (SFT) using the TRL framework and is intended for general text generation tasks.
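Since the model is Qwen3-based, it most likely expects ChatML-formatted prompts. The sketch below assembles such a prompt in plain Python; the template, and the use of the `ChuGyouk/R17` model id with the `transformers` pipeline in the trailing comment, are assumptions not confirmed by this card.

```python
# Hypothetical usage sketch for a Qwen3-derived model such as ChuGyouk/R17.
# The ChatML template below is the format Qwen-family chat models commonly use;
# verify against the model's own tokenizer/chat template before relying on it.

def build_chatml_prompt(messages):
    """Assemble a ChatML prompt: <|im_start|>role\ncontent<|im_end|> per turn."""
    parts = [f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>" for m in messages]
    # Leave the assistant turn open so the model continues from here.
    parts.append("<|im_start|>assistant\n")
    return "\n".join(parts)

prompt = build_chatml_prompt([
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Summarize supervised fine-tuning in one sentence."},
])
print(prompt)

# Generation itself would go through transformers, e.g. (not run here):
#   from transformers import pipeline
#   pipe = pipeline("text-generation", model="ChuGyouk/R17")
#   print(pipe(prompt, max_new_tokens=128)[0]["generated_text"])
```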
