ChuGyouk/F_R2
- Task: text generation
- Model size: 8B
- Quantization: FP8
- Context length: 32k
- Architecture: Transformer
- Published: Mar 26, 2026

ChuGyouk/F_R2 is an 8-billion-parameter language model developed by ChuGyouk, fine-tuned from ChuGyouk/Qwen3-8B-Base using supervised fine-tuning (SFT) with the TRL library. It is designed for general text generation and supports a 32,768-token context length.
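A minimal usage sketch with the Hugging Face `transformers` library is shown below. It assumes the model is hosted on the Hub under the id `ChuGyouk/F_R2` and inherits the standard chat template from its Qwen3 base; the `generate` helper and its parameters are illustrative, not part of any official API for this model.

```python
MODEL_ID = "ChuGyouk/F_R2"  # assumed Hub id, taken from the model card title


def build_messages(user_prompt: str) -> list[dict]:
    """Wrap a user prompt in the chat-message format expected by
    tokenizer.apply_chat_template."""
    return [{"role": "user", "content": user_prompt}]


def generate(prompt: str, max_new_tokens: int = 256) -> str:
    """Load the model and produce a single completion.

    Imports are kept inside the function so the sketch can be read and
    imported without the (heavy) transformers dependency installed.
    """
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype="auto", device_map="auto"
    )
    # Apply the chat template and move the token ids to the model's device.
    input_ids = tokenizer.apply_chat_template(
        build_messages(prompt), add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)
    output_ids = model.generate(input_ids, max_new_tokens=max_new_tokens)
    # Strip the prompt tokens before decoding so only the completion remains.
    return tokenizer.decode(
        output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True
    )


if __name__ == "__main__":
    print(generate("Explain supervised fine-tuning in one sentence."))
```

Because the model is FP8-quantized and has 8B parameters, a GPU with appropriate FP8 or bf16 support is assumed for the `device_map="auto"` path.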
