Tristepin/udk-ue3-qw34b-v2
Task: Text Generation
Concurrency Cost: 1
Model Size: 4B
Quantization: BF16
Context Length: 32k
Published: Mar 12, 2026
License: apache-2.0
Architecture: Transformer (Open Weights)

Tristepin/udk-ue3-qw34b-v2 is a 4-billion-parameter language model developed by Tristepin, fine-tuned from Jackrong/Qwen3-4B-2507-Claude-4.6-Opus-Reasoning-Distilled. It was trained with Unsloth and Hugging Face's TRL library, which yielded a 2x speedup during fine-tuning. Built on a Qwen3 base, it is intended for general language tasks.