ChuGyouk/F_R2_1_T1
- **Task:** Text generation
- **Model size:** 8B parameters
- **Quantization:** FP8
- **Context length:** 32k tokens
- **Architecture:** Transformer
- **Published:** Mar 27, 2026

ChuGyouk/F_R2_1_T1 is an 8-billion-parameter language model developed by ChuGyouk, fine-tuned from ChuGyouk/F_R2_1 using the TRL library. The model targets text generation tasks and supports a 32,768-token context window. It was adapted via supervised fine-tuning (SFT) for general conversational and question-answering use.
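A minimal usage sketch with the `transformers` library, assuming the model is published on the Hugging Face Hub under the id `ChuGyouk/F_R2_1_T1` and ships a chat template (typical for SFT conversational models); the example prompt and generation settings are illustrative, not from the model card:

```python
# Sketch: load the model and run one chat turn.
# Assumes the Hub id "ChuGyouk/F_R2_1_T1" and a chat template in the tokenizer.
# An 8B model needs roughly 16 GB of GPU memory in bf16 (less on FP8-capable hardware).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ChuGyouk/F_R2_1_T1"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # pick the checkpoint's native dtype
    device_map="auto",    # place layers on available GPU(s)/CPU
)

# Build a single-turn conversation and format it with the model's chat template.
messages = [{"role": "user", "content": "Explain supervised fine-tuning in two sentences."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

# Generate a reply and decode only the newly produced tokens.
output_ids = model.generate(input_ids, max_new_tokens=256)
reply = tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True)
print(reply)
```

Because the full 32k context is supported, long documents can be passed in the user message directly; for memory-constrained setups, a quantized load (e.g. via `bitsandbytes`) is a common alternative.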
