ChuGyouk/F_R1_1_T5
Text generation · Concurrency cost: 1 · Model size: 8B · Quantization: FP8 · Context length: 32k · Published: Mar 27, 2026 · Architecture: Transformer

ChuGyouk/F_R1_1_T5 is an 8-billion-parameter instruction-tuned causal language model developed by ChuGyouk. It is a fine-tuned version of ChuGyouk/F_R1_1, optimized for text generation with a context length of 32768 tokens. The model was trained with the TRL library, making it suitable for conversational AI and question-answering applications.
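A minimal usage sketch, assuming the model is published on the Hugging Face Hub under the id above and that the `transformers` library is installed (the helper names `build_messages` and `generate` are illustrative, not part of the model card):

```python
MODEL_ID = "ChuGyouk/F_R1_1_T5"  # assumed Hub id, taken from the card title
MAX_CONTEXT = 32768  # context length stated on the card


def build_messages(question: str) -> list[dict]:
    # Wrap a user question in the chat-message format used by
    # instruction-tuned models with a chat template.
    return [{"role": "user", "content": question}]


def generate(question: str, max_new_tokens: int = 256) -> str:
    # Imported lazily so the helpers above work without the model downloaded.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype="auto", device_map="auto"
    )
    # Render the chat messages with the model's own template,
    # then truncate the prompt to the advertised context window.
    prompt = tokenizer.apply_chat_template(
        build_messages(question), tokenize=False, add_generation_prompt=True
    )
    inputs = tokenizer(
        prompt, return_tensors="pt", truncation=True, max_length=MAX_CONTEXT
    ).to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the echoed prompt.
    return tokenizer.decode(
        output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
    )


if __name__ == "__main__":
    print(generate("What is the capital of France?"))
```

Loading an 8B model in FP8 or half precision requires a GPU with sufficient memory; `device_map="auto"` lets `transformers` place the weights automatically.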
