Hyeongwon/PS_prob_seed45_Qwen3-4B-Base_0322-01
Text generation · Concurrency cost: 1 · Model size: 4B · Quantization: BF16 · Context length: 32k · Published: Mar 23, 2026 · Architecture: Transformer

Hyeongwon/PS_prob_seed45_Qwen3-4B-Base_0322-01 is a 4-billion-parameter language model fine-tuned from Hyeongwon/Qwen3-4B-Base with supervised fine-tuning (SFT) using the TRL library. It targets general text generation tasks, balancing model size against output quality. Its 32,768-token context length makes it suitable for processing long inputs and generating coherent, extended responses.
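As a minimal sketch of how the model described above could be loaded for text generation, the following uses the standard Hugging Face Transformers API. The model ID and BF16 precision come from this card; the generation settings and the `generate` helper are illustrative assumptions, not an official usage recipe.

```python
# Minimal usage sketch for this model with Hugging Face Transformers.
# MODEL_ID and MAX_CONTEXT come from the model card; everything else
# (function name, generation settings) is illustrative.
MODEL_ID = "Hyeongwon/PS_prob_seed45_Qwen3-4B-Base_0322-01"
MAX_CONTEXT = 32_768  # context length stated on the card


def generate(prompt: str, max_new_tokens: int = 256) -> str:
    """Load the model and generate a completion for `prompt`."""
    # Imports are deferred so this sketch can be inspected without
    # installing torch/transformers or downloading the 4B checkpoint.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype=torch.bfloat16,  # matches the BF16 quantization on the card
        device_map="auto",
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output[0], skip_special_tokens=True)


if __name__ == "__main__":
    print(generate("Write a short summary of supervised fine-tuning."))
```

Note that loading the full checkpoint requires roughly 8 GB of memory in BF16; `device_map="auto"` lets Transformers place weights on the available GPU(s) or fall back to CPU.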
