ChuGyouk/F_R1_T7
Text Generation · Model Size: 8B · Quantization: FP8 · Context Length: 32k · Concurrency Cost: 1 · Architecture: Transformer · Published: Mar 27, 2026

ChuGyouk/F_R1_T7 is an 8-billion-parameter instruction-tuned causal language model developed by ChuGyouk and fine-tuned from the ChuGyouk/F_R1 base model. It was trained with the TRL library with a focus on conversational text generation, and is optimized for producing responses to user prompts, making it suitable for interactive AI applications.
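The card does not include usage code; below is a minimal inference sketch with the Hugging Face `transformers` library, assuming the repository follows the standard chat-template convention for instruction-tuned models. The prompt text and generation parameters are illustrative, and loading the FP8 checkpoint may require an FP8-capable GPU or a backend that dequantizes on load.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ChuGyouk/F_R1_T7"

# Load the tokenizer and model; device_map="auto" places weights on the
# available accelerator(s).
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Chat-style prompt, formatted through the model's chat template.
messages = [
    {"role": "user", "content": "Explain FP8 quantization in one sentence."}
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(inputs, max_new_tokens=256)

# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))
```

With a 32k-token context window, multi-turn histories can be passed by extending the `messages` list with alternating `user` and `assistant` entries before re-applying the chat template.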
