Hyeongwon/PH_prob_mini_Qwen3-8B-Base_0305-01
Text Generation
Concurrency Cost: 1
Model Size: 8B
Quant: FP8
Ctx Length: 32k
Published: Mar 5, 2026
Architecture: Transformer
Cold

PH_prob_mini_Qwen3-8B-Base_0305-01 is an 8-billion-parameter language model developed by Hyeongwon, fine-tuned from ChuGyouk/Qwen3-8B-Base using Supervised Fine-Tuning (SFT) with the TRL framework. It supports a 32,768-token context length and is designed for general text generation tasks, building on the Qwen3 architecture.
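As a causal language model published in the standard Hugging Face format, it can presumably be loaded with the `transformers` library. The sketch below shows a generic generation helper; the exact repository id, dtype handling, and hardware setup are assumptions, not details stated on this card.

```python
# Minimal usage sketch, assuming the model is hosted on the Hugging Face Hub
# under this repository id and loads via the standard transformers API.
MODEL_ID = "Hyeongwon/PH_prob_mini_Qwen3-8B-Base_0305-01"


def generate(prompt: str, max_new_tokens: int = 256) -> str:
    """Generate a completion for `prompt` (hypothetical helper, not part of the card)."""
    # Imports are deferred because transformers/torch are heavy dependencies.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    # device_map="auto" places the 8B weights on available accelerators;
    # torch_dtype="auto" respects the checkpoint's stored precision.
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype="auto", device_map="auto"
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, skipping the echoed prompt.
    new_tokens = outputs[0][inputs["input_ids"].shape[-1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)
```

Prompts up to the advertised 32k context can be passed directly; anything longer would need truncation before tokenization.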
