Hyeongwon/P2-split2_prob_Qwen3-8B-Base_0325-01
Text generation · Concurrency cost: 1 · Model size: 8B · Quantization: FP8 · Context length: 32k · Published: Mar 25, 2026 · Architecture: Transformer

Hyeongwon/P2-split2_prob_Qwen3-8B-Base_0325-01 is an 8-billion-parameter causal language model, fine-tuned from ChuGyouk/Qwen3-8B-Base via supervised fine-tuning (SFT) with TRL. It is intended for text generation, producing coherent and contextually relevant outputs, and its 32768-token context length makes it suitable for processing long inputs and generating extended responses.
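A minimal usage sketch with the `transformers` library, assuming the checkpoint is available on the Hugging Face Hub under this repo id (the prompt, `max_new_tokens` value, and truncation strategy are illustrative choices, not part of the card):

```python
MODEL_ID = "Hyeongwon/P2-split2_prob_Qwen3-8B-Base_0325-01"
MAX_CONTEXT = 32768  # context length stated on the card


def generate(prompt: str, max_new_tokens: int = 256) -> str:
    """Generate a completion, truncating the prompt to fit the context window."""
    # Imported lazily so the constants above can be used without pulling in
    # the heavy dependency (and the 8B checkpoint) at import time.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")

    # Leave room inside the 32k window for the newly generated tokens.
    inputs = tokenizer(
        prompt,
        return_tensors="pt",
        truncation=True,
        max_length=MAX_CONTEXT - max_new_tokens,
    ).to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output[0], skip_special_tokens=True)


if __name__ == "__main__":
    print(generate("Write a short paragraph about transformer language models."))
```

Note that an 8B checkpoint requires substantial GPU or CPU memory; `device_map="auto"` lets `transformers` place the weights on available devices.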
