ChuGyouk/F_R3_T4
Text generation | Model size: 8B | Quantization: FP8 | Context length: 32k | Published: Mar 26, 2026 | Architecture: Transformer | Concurrency cost: 1

ChuGyouk/F_R3_T4 is an 8-billion-parameter causal language model developed by ChuGyouk, fine-tuned from the F_R3 base model. It was trained with Supervised Fine-Tuning (SFT) using the TRL framework and supports a context length of 32768 tokens. The model is intended for general text generation, and is capable of responding to complex prompts and holding multi-turn conversations.
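A minimal inference sketch with the Hugging Face `transformers` library is shown below. The repo ID is taken from this card; whether the tokenizer ships a chat template is an assumption, as are the generation settings. The `clamp_to_context` helper is a hypothetical utility illustrating how to keep long prompts within the stated 32768-token window.

```python
MODEL_ID = "ChuGyouk/F_R3_T4"  # repo ID as stated on this card
MAX_CONTEXT = 32768            # context length stated on this card


def clamp_to_context(input_ids, max_len=MAX_CONTEXT, reserve=512):
    """Keep only the most recent tokens so prompt + generation fit in context.

    `reserve` leaves room for newly generated tokens. This is an illustrative
    helper, not part of the model's own API.
    """
    budget = max_len - reserve
    return input_ids[-budget:] if len(input_ids) > budget else input_ids


if __name__ == "__main__":
    # Heavy imports kept inside the entry point so the helper above can be
    # used without transformers/torch installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype="auto", device_map="auto"
    )

    messages = [
        {"role": "user", "content": "Explain supervised fine-tuning in two sentences."}
    ]
    # Assumes the tokenizer provides a chat template; fall back to plain
    # tokenizer(prompt, ...) if it does not.
    inputs = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)

    out = model.generate(clamp_to_context(inputs[0]).unsqueeze(0), max_new_tokens=256)
    print(tokenizer.decode(out[0][inputs.shape[-1]:], skip_special_tokens=True))
```

Loading with `torch_dtype="auto"` respects the checkpoint's stored precision; for the FP8-quantized variant listed above, serving stacks such as vLLM may be a better fit than plain `transformers`.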
