ChuGyouk/R4
Text generation · Model size: 8B · Quantization: FP8 · Context length: 32k · Concurrency cost: 1 · Architecture: Transformer · Published: Mar 27, 2026

ChuGyouk/R4 is an 8-billion-parameter language model developed by ChuGyouk, fine-tuned from ChuGyouk/Qwen3-8B-Base using the TRL framework. With a context length of 32,768 tokens, the model targets general text-generation tasks, using supervised fine-tuning (SFT) to strengthen its conversational abilities. Its primary use case is generating coherent, contextually relevant text from user prompts.
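Since the model is an SFT chat model hosted as ChuGyouk/R4, it can presumably be queried through the standard Hugging Face `transformers` text-generation pipeline with chat-style messages. This is a minimal sketch, not taken from the model card: the dtype, device placement, and generation settings are assumptions, and loading the 8B checkpoint requires sufficient GPU or CPU memory.

```python
def build_messages(prompt: str) -> list[dict]:
    """Wrap a user prompt in the chat-message format the pipeline expects."""
    return [{"role": "user", "content": prompt}]

def generate(prompt: str, max_new_tokens: int = 256) -> str:
    """Run one chat turn against ChuGyouk/R4 (sketch; settings are assumptions)."""
    from transformers import pipeline  # requires `pip install transformers torch`

    generator = pipeline(
        "text-generation",
        model="ChuGyouk/R4",
        torch_dtype="auto",   # let transformers pick a dtype; FP8 serving is host-side
        device_map="auto",    # place layers on available GPU(s), else CPU
    )
    out = generator(build_messages(prompt), max_new_tokens=max_new_tokens)
    # The pipeline returns the full chat transcript; take the last (assistant) turn.
    return out[0]["generated_text"][-1]["content"]

if __name__ == "__main__":
    print(generate("Explain supervised fine-tuning in two sentences."))
```

Keeping the model name in one place makes it easy to swap in the base checkpoint (ChuGyouk/Qwen3-8B-Base) for comparison.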
