acesmile/Qwen3-14B_merged
Text generation · Concurrency cost: 1 · Model size: 14B · Quantization: FP8 · Context length: 32k · Published: Jan 16, 2026 · License: apache-2.0 · Architecture: Transformer · Open weights
acesmile/Qwen3-14B_merged is a 14-billion-parameter Qwen3 model published by acesmile, fine-tuned from unsloth/Qwen3-14B-unsloth-bnb-4bit. It was fine-tuned with Unsloth and Hugging Face's TRL library, a combination that substantially speeds up training of Qwen3-architecture models. The merged model is intended for general language tasks.