ChuGyouk/F_R7_1_T1
Text generation · Concurrency cost: 1 · Model size: 8B · Quantization: FP8 · Context length: 32k · Published: Mar 27, 2026 · Architecture: Transformer · Cold

ChuGyouk/F_R7_1_T1 is an 8-billion-parameter causal language model developed by ChuGyouk, fine-tuned from the F_R7_1 base model via supervised fine-tuning (SFT). It supports a context length of 32768 tokens and is optimized for text generation, targeting general-purpose conversational AI and question-answering applications.
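When serving a model with a fixed 32768-token context, prompt and output lengths must share that budget. The sketch below is a minimal, hypothetical helper for capping generation length against the window, with an assumed (commented-out) Hugging Face `transformers` usage pattern; the helper name and the exact serving code are illustrative, only the model id and context length come from the card.

```python
# Hypothetical helper: budget new tokens against the model's 32768-token context.
def clamp_new_tokens(prompt_tokens: int, requested: int, ctx_len: int = 32768) -> int:
    """Cap generation length so prompt + output fit within the context window."""
    remaining = max(ctx_len - prompt_tokens, 0)
    return min(requested, remaining)

if __name__ == "__main__":
    # Assumed usage with Hugging Face transformers (requires downloading weights;
    # shown as comments because the exact serving setup is not specified here):
    # from transformers import AutoModelForCausalLM, AutoTokenizer
    # tok = AutoTokenizer.from_pretrained("ChuGyouk/F_R7_1_T1")
    # model = AutoModelForCausalLM.from_pretrained("ChuGyouk/F_R7_1_T1", torch_dtype="auto")
    # ids = tok("Explain FP8 quantization.", return_tensors="pt")
    # n = clamp_new_tokens(ids["input_ids"].shape[1], 512)
    # out = model.generate(**ids, max_new_tokens=n)
    # print(tok.decode(out[0], skip_special_tokens=True))

    # With 32000 prompt tokens, only 768 tokens of headroom remain:
    print(clamp_new_tokens(32000, 1024))  # → 768
```

This keeps long-context requests from silently exceeding the window, which would otherwise truncate the prompt or raise an error depending on the serving stack.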
