ChuGyouk/F_R99_1_T1
Text Generation | Concurrency Cost: 1 | Model Size: 8B | Quant: FP8 | Ctx Length: 8k | Published: Mar 30, 2026 | Architecture: Transformer

ChuGyouk/F_R99_1_T1 is an 8-billion-parameter instruction-tuned causal language model developed by ChuGyouk and fine-tuned from the F_R99 base model. It was trained with supervised fine-tuning (SFT) using the TRL framework, giving it enhanced conversational capabilities. With an 8192-token context length, it is suited to general text generation and interactive applications.
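A minimal usage sketch is below, assuming the checkpoint is hosted on the Hugging Face Hub under the repo id ChuGyouk/F_R99_1_T1 and exposes a chat template (neither is verified here); the `generate_reply` helper and its defaults are illustrative, not part of the model's published API.

```python
# Hedged usage sketch for an instruction-tuned causal LM via Hugging Face
# transformers. Running generate_reply() requires the `transformers` package,
# network access to the Hub, and enough memory for an 8B checkpoint.

def to_chat_messages(user_message: str) -> list[dict]:
    """Wrap a single user turn in the message format expected by
    tokenizer.apply_chat_template()."""
    return [{"role": "user", "content": user_message}]

def generate_reply(user_message: str, max_new_tokens: int = 256) -> str:
    """Load the model lazily and generate one assistant reply."""
    from transformers import AutoModelForCausalLM, AutoTokenizer  # heavy import kept local

    model_id = "ChuGyouk/F_R99_1_T1"  # assumed Hub repo id
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id, torch_dtype="auto", device_map="auto"
    )

    # Render the chat turn into the model's prompt format.
    prompt = tokenizer.apply_chat_template(
        to_chat_messages(user_message), tokenize=False, add_generation_prompt=True
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, skipping the prompt.
    return tokenizer.decode(
        output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
    )

# generate_reply("Summarize the benefits of FP8 quantization.")  # needs GPU + weights
```

Because the model advertises an 8192-token context, prompt plus `max_new_tokens` should stay within that budget.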
