ChuGyouk/R15
Text generation · Model size: 8B · Quantization: FP8 · Context length: 32k · Concurrency cost: 1 · Architecture: Transformer · Published: Mar 28, 2026

ChuGyouk/R15 is an 8-billion-parameter instruction-tuned causal language model, fine-tuned from ChuGyouk/Qwen3-8B-Base with TRL. It targets general text generation, particularly conversational AI and question answering, building on the base model's language understanding. Training used supervised fine-tuning to improve response quality and coherence.
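A minimal usage sketch with the Hugging Face transformers library, assuming the checkpoint is published under the repo id ChuGyouk/R15 shown on this card. The generation parameters and the example question are illustrative assumptions, not values published by the author:

```python
def build_messages(question: str) -> list[dict]:
    """Wrap a user question in the chat-message format expected by
    instruction-tuned models (role/content dictionaries)."""
    return [{"role": "user", "content": question}]

if __name__ == "__main__":
    # Deferred import so the helper above is usable without transformers installed.
    from transformers import pipeline

    # Hypothetical invocation; an FP8 checkpoint may additionally require
    # hardware and kernel support for 8-bit floating point.
    generator = pipeline(
        "text-generation",
        model="ChuGyouk/R15",
        torch_dtype="auto",
        device_map="auto",
    )
    out = generator(
        build_messages("What is supervised fine-tuning?"),
        max_new_tokens=256,  # illustrative limit, well under the 32k context
    )
    print(out[0]["generated_text"])
```

The chat-message list is passed directly to the pipeline, which applies the model's chat template before generation.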
