ChuGyouk/F_R14_1
Task: Text Generation · Model Size: 8B · Quantization: FP8 · Context Length: 32k · Concurrency Cost: 1 · Architecture: Transformer · Published: Mar 28, 2026

ChuGyouk/F_R14_1 is an 8-billion-parameter causal language model developed by ChuGyouk, fine-tuned from ChuGyouk/Qwen3-8B-Base. It was trained with the TRL library using Supervised Fine-Tuning (SFT) to improve instruction following. The model is designed for general text generation tasks, particularly those requiring conversational responses to user prompts.
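The card does not include a usage snippet. A minimal inference sketch with the Hugging Face `transformers` library could look like the following; it assumes the model inherits the standard causal-LM interface and chat template from its Qwen3-8B-Base parent, and the prompt and generation settings are illustrative, not official recommendations.

```python
# Minimal inference sketch (assumption: the model loads via the standard
# transformers causal-LM API inherited from Qwen3-8B-Base; generation
# parameters here are illustrative defaults, not tuned recommendations).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ChuGyouk/F_R14_1"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# SFT chat models expect role-tagged messages rendered through the
# tokenizer's chat template rather than raw text.
messages = [
    {"role": "user", "content": "Explain supervised fine-tuning in one sentence."}
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```

Because the base model supports a 32k context window, longer multi-turn conversations can be passed in the same `messages` list, one dict per turn.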
