ChuGyouk/F_R18_1_T1
**Task:** Text generation · **Concurrency cost:** 1 · **Model size:** 8B · **Quantization:** FP8 · **Context length:** 32k · **Published:** Mar 28, 2026 · **Architecture:** Transformer

ChuGyouk/F_R18_1_T1 is an 8-billion-parameter language model developed by ChuGyouk, fine-tuned from ChuGyouk/F_R18_1 with supervised fine-tuning (SFT) using TRL. It targets text generation tasks, and its 32768-token context length allows it to handle long inputs, making it suitable for a range of conversational and generative applications.
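As a minimal usage sketch, the model can be loaded with Hugging Face `transformers` like any causal LM. This assumes the tokenizer ships a chat template (common for TRL SFT models) and that suitable hardware is available; none of these specifics are confirmed by the card beyond the repo id and context length.

```python
# Hedged sketch: loading and querying the model via transformers.
# MODEL_ID and CTX_LEN come from this card; everything else is a
# standard-pattern assumption, not a documented API for this repo.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "ChuGyouk/F_R18_1_T1"
CTX_LEN = 32768  # context length stated on this card

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID, torch_dtype="auto", device_map="auto"
)

messages = [
    {"role": "user",
     "content": "Summarize supervised fine-tuning in one sentence."}
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

# Cap new tokens so prompt + completion stays inside the 32k window.
budget = min(256, CTX_LEN - inputs.shape[-1])
outputs = model.generate(inputs, max_new_tokens=budget)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:],
                       skip_special_tokens=True))
```

The `max_new_tokens` cap illustrates the practical consequence of the fixed 32768-token window: prompt and completion share one budget.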
