Hyeongwon/PS_prob_Qwen3-4B-Base_0322-01
Text generation · Concurrency cost: 1 · Model size: 4B · Quantization: BF16 · Context length: 32k · Published: Mar 21, 2026 · Architecture: Transformer · Status: Warm

Hyeongwon/PS_prob_Qwen3-4B-Base_0322-01 is a 4-billion-parameter language model developed by Hyeongwon, fine-tuned from Qwen3-4B-Base. Trained with supervised fine-tuning (SFT) using the TRL library, it is designed for general text generation with a 32,768-token context length. It specializes in generating responses to open-ended questions and conversational prompts, building on the base model's capabilities.
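A minimal usage sketch with the Hugging Face `transformers` library, assuming the model is available on the Hub under this ID; the prompt and generation settings are illustrative, not part of the model card:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Hyeongwon/PS_prob_Qwen3-4B-Base_0322-01"

# Load the tokenizer and the model in BF16, matching the published precision.
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

# Open-ended prompt; the model supports contexts up to 32,768 tokens.
prompt = "Explain, in simple terms, why the sky appears blue."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Since this is a base-style SFT model rather than a chat model with a fixed template, plain-text prompts as above are a reasonable default.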
