ChuGyouk/F_R18_T2
Task: text generation
Model size: 8B parameters
Quantization: FP8
Context length: 32k tokens
Architecture: Transformer
Published: Mar 28, 2026

ChuGyouk/F_R18_T2 is an 8 billion parameter causal language model, produced by supervised fine-tuning (SFT) of the base model ChuGyouk/F_R18 with the TRL framework. It is intended for text generation tasks, and its 32768-token context length makes it suitable for applications requiring extended conversational or long-document understanding.
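A minimal usage sketch with the Hugging Face transformers library, assuming the model follows the standard causal-LM interface and ships a chat template; the prompt and generation parameters below are illustrative, not taken from the model card:

```python
# Illustrative inference sketch (assumption: standard transformers
# causal-LM interface; not verified against this repository).

MODEL_ID = "ChuGyouk/F_R18_T2"

def build_messages(user_text: str) -> list[dict]:
    # Wrap a single user turn in the chat format expected by
    # tokenizer.apply_chat_template.
    return [{"role": "user", "content": user_text}]

if __name__ == "__main__":
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype="auto", device_map="auto"
    )

    messages = build_messages("Summarize the plot of Hamlet in two sentences.")
    inputs = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)

    # The 32k context leaves plenty of room for long prompts;
    # generation length here is kept short for the example.
    output = model.generate(inputs, max_new_tokens=256, do_sample=False)
    print(tokenizer.decode(output[0][inputs.shape[-1]:],
                           skip_special_tokens=True))
```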
