ChuGyouk/F_R2_1
- Task: Text generation
- Model size: 8B parameters
- Quantization: FP8
- Context length: 32k
- Concurrency cost: 1
- Architecture: Transformer
- Published: Mar 26, 2026

ChuGyouk/F_R2_1 is an 8-billion-parameter language model fine-tuned from ChuGyouk/Qwen3-8B-Base using TRL. It targets general text generation, and its 32,768-token context window lets it handle long inputs. Building on the Qwen3 architecture, it is tuned for conversational AI and question answering.
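A minimal usage sketch, assuming the model is published in the standard Hugging Face `transformers` format (the repository ID is taken from this card; dtype and device placement are illustrative defaults, not confirmed settings):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Repository ID from this model card.
MODEL_ID = "ChuGyouk/F_R2_1"


def load_model(model_id: str = MODEL_ID):
    """Load tokenizer and model; `torch_dtype="auto"` and
    `device_map="auto"` are assumed-reasonable defaults."""
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id, torch_dtype="auto", device_map="auto"
    )
    return tokenizer, model


if __name__ == "__main__":
    # Downloads the 8B checkpoint; requires sufficient GPU/CPU memory.
    tokenizer, model = load_model()
    messages = [{"role": "user", "content": "Summarize what a transformer is."}]
    inputs = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)
    outputs = model.generate(inputs, max_new_tokens=256)
    print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```

The chat-template call assumes the tokenizer ships a chat template, which is typical for instruction-tuned Qwen3 derivatives but not verified here.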
