ChuGyouk/R5
Text Generation · Concurrency Cost: 1 · Model Size: 8B · Quant: FP8 · Ctx Length: 32k · Published: Mar 27, 2026 · Architecture: Transformer

ChuGyouk/R5 is an 8-billion-parameter language model fine-tuned from ChuGyouk/Qwen3-8B-Base. It was trained with Supervised Fine-Tuning (SFT) using the TRL framework, improving its performance on general text generation tasks. With a context length of 32,768 tokens, it can process moderately long inputs while generating coherent, relevant responses.
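A minimal inference sketch using the Hugging Face `transformers` library, assuming the weights are available on the Hub under the id `ChuGyouk/R5` (the hosting location and loading path are assumptions, not confirmed by this card; adjust `MODEL_ID` to wherever the weights actually live):

```python
MODEL_ID = "ChuGyouk/R5"       # assumed Hub id; change if hosted elsewhere
MAX_CONTEXT = 32_768           # context length stated on the model card


def generate(prompt: str, max_new_tokens: int = 256) -> str:
    """Generate a completion with ChuGyouk/R5 via transformers (sketch)."""
    # Imports kept local so the module can be inspected without transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype="auto", device_map="auto"
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    # Keep the prompt within the 32k-token context window.
    assert inputs["input_ids"].shape[1] <= MAX_CONTEXT, "prompt exceeds context length"
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the echoed prompt.
    new_tokens = output[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)


if __name__ == "__main__":
    print(generate("Summarize the benefits of supervised fine-tuning."))
```

Downloading and running an 8B model requires a GPU with sufficient memory; `device_map="auto"` lets `accelerate` place the weights automatically when available.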
