ChuGyouk/F_R7_T4
Text Generation · Concurrency Cost: 1 · Model Size: 8B · Quantization: FP8 · Context Length: 32k · Published: Mar 27, 2026 · Architecture: Transformer

ChuGyouk/F_R7_T4 is an 8-billion-parameter instruction-tuned causal language model developed by ChuGyouk, fine-tuned from the F_R7 base model. With a context length of 32,768 tokens, it is optimized for conversational AI and general text generation. It was fine-tuned with the TRL library to strengthen instruction following and response coherence, making it suitable for applications that require robust instruction following and natural language understanding.
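As a usage sketch, the model could be loaded for chat-style generation with the Hugging Face `transformers` library. This is a minimal example under stated assumptions: the repo id `ChuGyouk/F_R7_T4` resolving on the Hub and the tokenizer shipping a standard chat template are assumptions, not confirmed by the card.

```python
# Minimal generation sketch for ChuGyouk/F_R7_T4 via Hugging Face transformers.
# The Hub repo id and the presence of a chat template are assumptions.

MODEL_ID = "ChuGyouk/F_R7_T4"  # assumed Hub repo id
CTX_LENGTH = 32_768            # context window stated on the card
MAX_NEW_TOKENS = 256

def build_messages(user_message: str) -> list[dict]:
    """Single-turn conversation in the standard chat messages format."""
    return [{"role": "user", "content": user_message}]

def generate(user_message: str) -> str:
    """Load the model, format the prompt with its chat template, and generate."""
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")
    prompt = tokenizer.apply_chat_template(
        build_messages(user_message), tokenize=False, add_generation_prompt=True
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    out = model.generate(**inputs, max_new_tokens=MAX_NEW_TOKENS)
    # Decode only the newly generated tokens, skipping the echoed prompt.
    return tokenizer.decode(
        out[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
    )

if __name__ == "__main__":
    print(generate("Explain instruction tuning in two sentences."))
```

With FP8 weights at 8B parameters, the checkpoint should fit comfortably on a single modern GPU; `device_map="auto"` lets `transformers` place the weights automatically.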