DevopsEmbrace/qwen3_32B_simple_sft_IV_e3_unsloth_baseline_merged_16bit
Text generation · Concurrency cost: 2 · Model size: 32B · Quant: FP8 · Ctx length: 32k · Published: Jan 29, 2026 · License: apache-2.0 · Architecture: Transformer · Open weights
DevopsEmbrace/qwen3_32B_simple_sft_IV_e3_unsloth_baseline_merged_16bit is a 32-billion-parameter Qwen3 model developed by DevopsEmbrace, fine-tuned from DevopsEmbrace/qwen3_32B_embrace_cpt_IV_e3_unsloth_Baseline_merged_16bit. It was trained with Unsloth and Hugging Face's TRL library, which the authors report enabled 2x faster training. The model targets general language tasks, leveraging its large parameter count and efficient training pipeline.
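A minimal usage sketch, assuming the merged 16-bit weights are published as standard Transformers checkpoints on the Hugging Face Hub (the `generate_completion` helper and its parameters are illustrative, not part of the model card):

```python
# Illustrative sketch for loading this model with Hugging Face Transformers.
# Requires `pip install transformers accelerate` and enough GPU memory for a
# 32B-parameter model (roughly 64 GB in 16-bit precision).

MODEL_ID = "DevopsEmbrace/qwen3_32B_simple_sft_IV_e3_unsloth_baseline_merged_16bit"


def generate_completion(prompt: str, max_new_tokens: int = 128) -> str:
    """Load the model and tokenizer, then generate a completion for `prompt`.

    This is a hypothetical helper; it downloads the full checkpoint on first
    call, so it is only practical on machines with sufficient GPU memory.
    """
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        device_map="auto",   # spread layers across available GPUs
        torch_dtype="auto",  # keep the checkpoint's native 16-bit precision
    )

    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)
```

For chat-style use, Qwen3-family tokenizers typically ship a chat template, so `tokenizer.apply_chat_template(...)` is generally preferable to passing a raw prompt string.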