ChuGyouk/F_R7_T2
Text generation · Model size: 8B · Quantization: FP8 · Context length: 32k · Concurrency cost: 1 · Architecture: Transformer · Published: Mar 27, 2026

ChuGyouk/F_R7_T2 is an 8-billion-parameter causal language model, fine-tuned from ChuGyouk/F_R7 using Supervised Fine-Tuning (SFT) with TRL. It is designed for general text generation and offers a 32,768-token context length. The training methodology focuses on strengthening conversational and question-answering capabilities.
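A minimal usage sketch, assuming the checkpoint is hosted on the Hugging Face Hub and loadable with the standard `transformers` causal-LM classes (the model card does not show loading code, so the prompt format and helper below are illustrative, not the author's method):

```python
# Hypothetical loading sketch for ChuGyouk/F_R7_T2 via transformers.
# MODEL_ID comes from the card; build_prompt is an illustrative fallback,
# not the model's actual chat template.

MODEL_ID = "ChuGyouk/F_R7_T2"
CTX_LEN = 32_768  # context length stated on the model card


def build_prompt(messages):
    """Flatten a chat-style message list into a plain prompt string.

    Simple fallback for illustration; if the tokenizer defines a chat
    template, tokenizer.apply_chat_template should be preferred.
    """
    return "\n".join(f"{m['role']}: {m['content']}" for m in messages) + "\nassistant:"


def generate(messages, max_new_tokens=256):
    # Heavy imports kept local so the helper above stays importable
    # without torch/transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype="auto")
    prompt = build_prompt(messages)
    # Truncate to the model's advertised context window.
    inputs = tokenizer(prompt, return_tensors="pt",
                       truncation=True, max_length=CTX_LEN)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens.
    new_tokens = output[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)
```

Calling `generate([{"role": "user", "content": "..."}])` would then download the weights and return a completion; for FP8 inference the checkpoint may instead require a serving stack that supports that quantization format.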
