ChuGyouk/F_R17_1_T1
Text Generation · Concurrency Cost: 1 · Model Size: 8B · Quant: FP8 · Ctx Length: 32k · Published: Mar 28, 2026 · Architecture: Transformer

ChuGyouk/F_R17_1_T1 is an 8-billion-parameter language model developed by ChuGyouk, fine-tuned from the F_R17_1 base model. It was trained with supervised fine-tuning (SFT) using the TRL library at a 32,768-token context length, and is intended for general text generation tasks, extending the base model's capabilities through this additional fine-tuning stage.
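The card names TRL's SFT as the training method but does not publish the training data or hyperparameters. The sketch below shows what such a setup could look like; the dataset file, output directory, and batch-size settings are placeholders, and only the base-model id and 32k sequence length come from the card itself.

```python
# Hypothetical SFT setup sketch based on the card's description; the actual
# training data and hyperparameters used by the author are not published.
from datasets import load_dataset
from trl import SFTConfig, SFTTrainer

# Placeholder dataset: any instruction-style dataset in a supported format.
dataset = load_dataset("json", data_files="train.jsonl", split="train")

args = SFTConfig(
    output_dir="f_r17_1_t1-sft",   # placeholder
    max_seq_length=32768,          # matches the card's 32k context length
                                   # (renamed to max_length in newer TRL releases)
    per_device_train_batch_size=1, # assumed, not stated in the card
    gradient_accumulation_steps=8, # assumed, not stated in the card
    bf16=True,
)

trainer = SFTTrainer(
    model="ChuGyouk/F_R17_1",      # base model id inferred from the card
    args=args,
    train_dataset=dataset,
)
trainer.train()
```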
