yunjae-won/mpq3_qwen4bi_sft_dpo_beta1e-1_step1792
TEXT GENERATION · Concurrency Cost: 1 · Model Size: 4B · Quant: BF16 · Ctx Length: 32k · Published: Apr 6, 2026 · Architecture: Transformer

The yunjae-won/mpq3_qwen4bi_sft_dpo_beta1e-1_step1792 model is a 4-billion-parameter language model based on the Qwen architecture, shared by yunjae-won. The repository name suggests supervised fine-tuning (SFT) followed by DPO with beta 0.1, checkpointed at step 1792, though this is not confirmed anywhere in the card. Further details on its fine-tuning data, differentiators, and intended use cases are not provided in the available model card.