ChuGyouk/F_R14
Text generation · Model size: 8B · Quantization: FP8 · Context length: 32k · Concurrency cost: 1 · Architecture: Transformer · Published: Mar 28, 2026

ChuGyouk/F_R14 is an 8 billion parameter language model fine-tuned from ChuGyouk/Qwen3-8B-Base using the TRL framework. With a 32,768-token context length, the model targets general text generation tasks, including conversational AI and question answering. Its training methodology centers on supervised fine-tuning to improve response quality and coherence.
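Since the model is intended for conversational use, it can be loaded with the standard `transformers` causal-LM API. The sketch below is a minimal, hypothetical usage example: it assumes the checkpoint is published on the Hugging Face Hub under this ID and that it ships a chat template (common for Qwen3-derived models); the prompt and generation settings are illustrative, not from the model card.

```python
# Hypothetical usage sketch for ChuGyouk/F_R14.
# Assumes the `transformers` library is installed and the checkpoint
# (with a chat template) is available on the Hugging Face Hub.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "ChuGyouk/F_R14"  # model ID from this card


def generate(prompt: str, max_new_tokens: int = 256) -> str:
    """Run one chat turn through the model and return the decoded reply."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype="auto", device_map="auto"
    )
    # Format the user message with the model's chat template.
    messages = [{"role": "user", "content": prompt}]
    inputs = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)
    outputs = model.generate(inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the echoed prompt.
    return tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True)


if __name__ == "__main__":
    print(generate("Explain supervised fine-tuning in one sentence."))
```

Note that an 8B checkpoint in FP8 still needs a GPU with roughly 10 GB of memory; `device_map="auto"` lets `accelerate` place the weights across available devices.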
