ChuGyouk/F_R17_T3
Task: Text generation
Model size: 8B
Quantization: FP8
Context length: 32k
Concurrency cost: 1
Architecture: Transformer
Published: Mar 28, 2026

ChuGyouk/F_R17_T3 is an 8-billion-parameter language model from ChuGyouk, fine-tuned from the F_R17 base model using supervised fine-tuning (SFT) with the TRL framework. It targets text generation and supports a 32,768-token context length for processing longer inputs. Its fine-tuned nature suggests improved performance on conversational and other generative applications.
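As a minimal usage sketch, the model can be loaded with the Hugging Face `transformers` library. This assumes the checkpoint is hosted on the Hub under the repo id shown on this card; the `generate` helper below is illustrative, not part of the model release, and downloading the ~8B weights requires substantial disk space and memory.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Repo id taken from this card; assumed to be the Hub location of the weights.
MODEL_ID = "ChuGyouk/F_R17_T3"

def generate(prompt: str, max_new_tokens: int = 128) -> str:
    """Load the model and return a text completion for `prompt`.

    Note: this downloads and loads the full model on first call.
    """
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype="auto",   # keep the checkpoint's native precision
        device_map="auto",    # place weights on available GPU(s)/CPU
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Strip the prompt tokens so only the completion is returned.
    new_tokens = output_ids[0][inputs["input_ids"].shape[-1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)
```

Because the card lists a 32k context window, inputs up to 32,768 tokens should be accepted; memory use grows with sequence length, so long prompts may still require a large GPU.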
