ChuGyouk/F_R1_1_T1
Text generation · Concurrency cost: 1 · Model size: 8B · Quantization: FP8 · Context length: 32k · Published: Mar 26, 2026 · Architecture: Transformer · Cold

ChuGyouk/F_R1_1_T1 is an 8-billion-parameter language model developed by ChuGyouk and fine-tuned from the F_R1_1 base model. It was trained with supervised fine-tuning (SFT) using the TRL framework and specializes in text generation. It supports a 32,768-token context length, making it suitable for applications that require coherent, extended conversational outputs.
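A minimal inference sketch with the `transformers` library, assuming the model is downloadable from the Hugging Face Hub under the ID above (the prompt text and generation settings here are illustrative, not from the model card):

```python
MODEL_ID = "ChuGyouk/F_R1_1_T1"  # assumed Hub ID, taken from this card's title


def build_messages(user_prompt: str) -> list[dict]:
    """Chat-style message list in the format expected by apply_chat_template."""
    return [{"role": "user", "content": user_prompt}]


def generate(prompt: str, max_new_tokens: int = 256) -> str:
    # Heavy imports are deferred so the helpers above stay importable
    # without transformers/torch installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype="auto", device_map="auto"
    )
    inputs = tokenizer.apply_chat_template(
        build_messages(prompt), add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)
    out = model.generate(inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the echoed prompt.
    return tokenizer.decode(out[0][inputs.shape[-1]:], skip_special_tokens=True)


if __name__ == "__main__":
    print(generate("Explain FP8 quantization in two sentences."))
```

Note that the FP8-quantized weights and the 32k context window both reduce memory pressure at inference time, but an 8B model still typically needs a GPU with at least ~10 GB of VRAM to load.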
