ChuGyouk/F_R4
Text generation · Model size: 8B · Quantization: FP8 · Context length: 32k · Concurrency cost: 1 · Architecture: Transformer · Published: Mar 27, 2026

ChuGyouk/F_R4 is an 8-billion-parameter language model developed by ChuGyouk, fine-tuned from ChuGyouk/Qwen3-8B-Base using supervised fine-tuning (SFT) with the TRL framework. It is designed for general text generation, and its 32,768-token context window lets it process substantial input for conversational and creative applications.