ChuGyouk/F_R6_T2
Text generation · Concurrency cost: 1 · Model size: 8B · Quantization: FP8 · Context length: 32k · Published: Mar 27, 2026 · Architecture: Transformer

ChuGyouk/F_R6_T2 is an 8-billion-parameter language model fine-tuned from ChuGyouk/F_R6 using the TRL framework. It is designed for text generation, leveraging its 32,768-token context length to produce coherent, contextually relevant responses. Its training via Supervised Fine-Tuning (SFT) makes it suitable for general conversational AI and creative text generation.
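A minimal usage sketch with the Hugging Face `transformers` library is shown below. It assumes the checkpoint is hosted on the Hugging Face Hub under the model ID above and that the tokenizer ships a chat template; adjust `max_new_tokens` and device settings to your hardware.

```python
MODEL_ID = "ChuGyouk/F_R6_T2"   # model ID from this card (assumed to be on the Hub)
MAX_CONTEXT = 32_768            # 32k context length stated on the card


def generate(prompt: str, max_new_tokens: int = 256) -> str:
    """Generate a completion for a single user message.

    Lazy-imports transformers so the module stays importable
    without the heavy dependency installed.
    """
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype="auto",   # respect the FP8/checkpoint dtype where possible
        device_map="auto",    # place layers on available GPU(s)/CPU
    )

    messages = [{"role": "user", "content": prompt}]
    input_ids = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)

    # Leave room for the completion inside the 32k context window.
    assert input_ids.shape[-1] + max_new_tokens <= MAX_CONTEXT

    output_ids = model.generate(input_ids, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the echoed prompt.
    return tokenizer.decode(
        output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True
    )


if __name__ == "__main__":
    print(generate("Summarize supervised fine-tuning in one sentence."))
```

The generation call is kept behind a function (and a `__main__` guard) so the model weights are only downloaded when you actually invoke it.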
