ChuGyouk/R19
Text Generation · Model size: 8B · Quantization: FP8 · Context length: 32k · Concurrency cost: 1 · Architecture: Transformer · Published: Mar 28, 2026

ChuGyouk/R19 is an 8-billion-parameter instruction-tuned causal language model developed by ChuGyouk, fine-tuned from ChuGyouk/Qwen3-8B-Base. The model was trained with Supervised Fine-Tuning (SFT) using the TRL library, giving it general-purpose conversational capability. With a context length of 32,768 tokens, it is suitable for a wide range of text generation and understanding tasks.
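As a minimal usage sketch, the model can be loaded with the standard Hugging Face `transformers` chat workflow. This assumes the checkpoint is hosted on the Hub under `ChuGyouk/R19` and ships a chat template (typical for Qwen3-based instruct models); the prompt and generation settings are illustrative, not from the model card.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ChuGyouk/R19"  # assumed Hub repo id for this model

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # pick the checkpoint's native precision
    device_map="auto",    # place layers on available GPU(s)/CPU
)

# Build a chat prompt using the tokenizer's chat template
messages = [{"role": "user", "content": "Explain beam search in two sentences."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

# Generate a reply and decode only the newly produced tokens
outputs = model.generate(inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```

Since the context window is 32,768 tokens, long inputs can be passed directly; for serving or batched inference, a dedicated engine such as vLLM would be the usual alternative to raw `generate` calls.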
