ChuGyouk/R2
Text generation · Concurrency cost: 1 · Model size: 8B · Quantization: FP8 · Context length: 32k · Published: Mar 27, 2026 · Architecture: Transformer

ChuGyouk/R2 is an 8-billion-parameter instruction-tuned causal language model, fine-tuned from ChuGyouk/Qwen3-8B-Base using the TRL framework. It is designed for general text generation, building on its base architecture and fine-tuning to improve conversational and instruction-following behavior. With a 32,768-token context length, it is suited to applications that process longer inputs and generate coherent, extended responses.
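As a usage sketch, the model can be loaded with the Hugging Face `transformers` library like any other instruction-tuned causal LM. This assumes the checkpoint is published on the Hugging Face Hub under the same `ChuGyouk/R2` id and ships a chat template; the prompt text is illustrative.

```python
MODEL_ID = "ChuGyouk/R2"  # assumed Hub repo id, per the model card


def generate(prompt: str, max_new_tokens: int = 256) -> str:
    """Run a single instruction-following generation with ChuGyouk/R2."""
    # Lazy import so the sketch can be read/imported without transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype="auto",   # pick the checkpoint's native dtype
        device_map="auto",    # place weights on available GPU(s) or CPU
    )

    # Format the request with the model's chat template.
    messages = [{"role": "user", "content": prompt}]
    inputs = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)

    out = model.generate(inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the echoed prompt.
    return tokenizer.decode(out[0][inputs.shape[-1]:], skip_special_tokens=True)


# Example call (downloads the 8B checkpoint on first use):
# print(generate("Summarize the benefits of long-context models in two sentences."))
```

Because the context window is 32,768 tokens, prompt length plus `max_new_tokens` should stay within that budget.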
