ChuGyouk/F_R8_T3_low_bsz
Text generation · Concurrency cost: 1 · Model size: 8B · Quantization: FP8 · Context length: 8k · Published: Mar 30, 2026 · Architecture: Transformer

ChuGyouk/F_R8_T3_low_bsz is an 8-billion-parameter instruction-tuned causal language model developed by ChuGyouk, fine-tuned from the F_R8 base model using supervised fine-tuning (SFT) with the TRL framework. It is designed for general text-generation tasks and offers an 8192-token context window for processing longer inputs.
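Since the card describes a standard instruction-tuned causal LM, one plausible way to use it is via the Hugging Face `transformers` library. The sketch below is an assumption, not documented usage: the card does not specify a serving stack, and it presumes the repository ships a tokenizer with a chat template (common for instruction-tuned models). Only the repo id `ChuGyouk/F_R8_T3_low_bsz` comes from the card; the helper names are illustrative.

```python
from typing import Dict, List


def build_chat(user_prompt: str) -> List[Dict[str, str]]:
    """Build a minimal chat message list in the common role/content format."""
    return [{"role": "user", "content": user_prompt}]


def generate_reply(user_prompt: str,
                   model_id: str = "ChuGyouk/F_R8_T3_low_bsz",
                   max_new_tokens: int = 128) -> str:
    """Hypothetical helper: load the model and generate a reply.

    Assumes `transformers` (and a backend such as torch) is installed and the
    repo provides a chat template; imports are deferred so the rest of this
    module works without those dependencies.
    """
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto")

    # Render the chat into input ids using the repo's chat template.
    input_ids = tokenizer.apply_chat_template(
        build_chat(user_prompt),
        add_generation_prompt=True,
        return_tensors="pt",
    )
    output_ids = model.generate(input_ids, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, skipping the prompt.
    return tokenizer.decode(output_ids[0][input_ids.shape[-1]:],
                            skip_special_tokens=True)
```

Keeping the prompt within the stated 8192-token context window is the caller's responsibility; inputs longer than that would need truncation before generation.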
