ChuGyouk/R18_1
Text Generation | Concurrency Cost: 1 | Model Size: 8B | Quant: FP8 | Ctx Length: 32k | Published: Mar 27, 2026 | Architecture: Transformer

ChuGyouk/R18_1 is an 8-billion-parameter language model fine-tuned from ChuGyouk/Qwen3-8B-Base with a 32,768-token context length. It was trained with supervised fine-tuning (SFT) using the TRL framework and is intended for general text generation tasks, building on the Qwen3 architecture.
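A minimal usage sketch with the Hugging Face `transformers` library, assuming the repository follows the standard Qwen3 checkpoint layout and ships a chat template (the prompt and generation settings below are illustrative, not from the model card):

```python
# Hedged sketch: load ChuGyouk/R18_1 and generate text with transformers.
# Assumes a standard causal-LM checkpoint with a chat template; adjust
# max_new_tokens and dtype/device settings to your hardware.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "ChuGyouk/R18_1"

def generate(prompt: str, max_new_tokens: int = 256) -> str:
    """Generate a completion for a single user prompt."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype="auto",   # picks up the checkpoint's native dtype
        device_map="auto",    # places the 8B weights on available devices
    )
    messages = [{"role": "user", "content": prompt}]
    input_ids = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)
    output = model.generate(input_ids, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the prompt.
    return tokenizer.decode(
        output[0][input_ids.shape[-1]:], skip_special_tokens=True
    )

if __name__ == "__main__":
    print(generate("Summarize what supervised fine-tuning is in one sentence."))
```

Since the model fits in FP8 at 8B parameters, a single modern GPU is typically sufficient; `device_map="auto"` will fall back to CPU offloading otherwise.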
