Hyeongwon/P2_prob_Qwen3-8B-Base_0309-01
Text generation · Model size: 8B · Quantization: FP8 · Context length: 32k · Architecture: Transformer · Concurrency cost: 1 · Published: Mar 9, 2026

Hyeongwon/P2_prob_Qwen3-8B-Base_0309-01 is an 8-billion-parameter causal language model fine-tuned from ChuGyouk/Qwen3-8B-Base using the TRL framework, trained with supervised fine-tuning (SFT). With a 32,768-token context window, it is intended for general text generation tasks.
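A minimal usage sketch, assuming the checkpoint is hosted on the Hugging Face Hub under the model ID above and loads with the standard `transformers` AutoModel API (the helper names `load_model` and `generate` are illustrative, not part of the release):

```python
MODEL_ID = "Hyeongwon/P2_prob_Qwen3-8B-Base_0309-01"

def load_model(model_id: str = MODEL_ID):
    """Load the tokenizer and causal LM (downloads weights on first call)."""
    # Imported here so the sketch can be read without transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        torch_dtype="auto",   # use the dtype stored in the checkpoint
        device_map="auto",    # place the 8B weights on available devices
    )
    return tokenizer, model

def generate(tokenizer, model, prompt: str, max_new_tokens: int = 128) -> str:
    """Plain text completion; the model accepts contexts up to 32,768 tokens."""
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)
```

Since this is a base-style SFT checkpoint rather than a chat model, prompts are plain text completions rather than chat-template messages.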
