ChuGyouk/F_R5_1_T1
Text Generation | Concurrency Cost: 1 | Model Size: 8B | Quant: FP8 | Ctx Length: 32k | Published: Mar 28, 2026 | Architecture: Transformer

ChuGyouk/F_R5_1_T1 is an 8-billion-parameter language model fine-tuned by ChuGyouk from the F_R5_1 base model, with a context length of 32,768 tokens. It was trained with Supervised Fine-Tuning (SFT) using the TRL framework. The model targets text generation, particularly conversational question answering, building on its base model's capabilities.
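A minimal sketch of querying the model for conversational question answering with the Hugging Face `transformers` pipeline. The chat-message format and generation settings (`max_new_tokens`, `device_map`) are assumptions based on common usage, not settings confirmed by this card; running it downloads the full 8B checkpoint and needs sufficient GPU memory.

```python
def build_messages(question: str) -> list[dict]:
    """Format a single-turn conversational QA prompt as chat messages
    (the standard role/content structure used by chat templates)."""
    return [{"role": "user", "content": question}]


def answer(question: str, model_id: str = "ChuGyouk/F_R5_1_T1") -> str:
    """Generate an answer with the model. Assumed usage sketch: loads the
    full checkpoint, so it requires a capable GPU and network access."""
    from transformers import pipeline  # lazy import; heavy dependency

    chat = pipeline("text-generation", model=model_id, device_map="auto")
    out = chat(build_messages(question), max_new_tokens=256)
    # The pipeline returns the conversation with the assistant's reply
    # appended as the final message.
    return out[0]["generated_text"][-1]["content"]
```

Typical use would be `answer("What is supervised fine-tuning?")`; swap `model_id` for a local path if the weights are already downloaded.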
