ChuGyouk/F_R19_T2
Task: Text generation
Model size: 8B
Quantization: FP8
Context length: 32k
Concurrency cost: 1
Architecture: Transformer
Published: Mar 28, 2026

F_R19_T2 is an 8-billion-parameter instruction-tuned causal language model developed by ChuGyouk, fine-tuned from the F_R19 base model. It was trained with the TRL framework and supports a 32,768-token context length. The model is intended for general text generation and instruction following in response to user prompts.
