PolarisETP/qwen25-3b-peacetalk-magic-v2-merged
Text Generation
Concurrency Cost: 1
Model Size: 3.1B
Quant: BF16
Ctx Length: 32k
Published: Mar 21, 2026
License: apache-2.0
Architecture: Transformer
Open Weights

PolarisETP/qwen25-3b-peacetalk-magic-v2-merged is a 3.1-billion-parameter language model fine-tuned from Qwen/Qwen2.5-3B-Instruct. Developed by PolarisETP, it targets English-language tasks. Its primary differentiator is fine-tuning for 'peacetalk' and conflict-resolution applications, suggesting an emphasis on nuanced conversational understanding and generation in sensitive contexts.
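Because the base model is Qwen2.5-3B-Instruct, prompts are expected to follow Qwen's ChatML-style chat format. The sketch below shows that layout without downloading the model; the mediator system prompt is a hypothetical illustration, not text from this model card, and in practice you would let the tokenizer's `apply_chat_template` (Hugging Face `transformers`) render it for you.

```python
def format_chatml(messages):
    """Render a list of {role, content} dicts in the ChatML-style format
    used by Qwen instruct models, ending with an open assistant turn so
    the model continues from there during generation."""
    parts = []
    for m in messages:
        parts.append(f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>\n")
    parts.append("<|im_start|>assistant\n")
    return "".join(parts)

# Hypothetical conflict-resolution conversation (illustrative only).
messages = [
    {"role": "system",
     "content": "You are a neutral mediator. De-escalate and summarize both positions fairly."},
    {"role": "user",
     "content": "My coworker and I disagree about who owns this project."},
]
prompt = format_chatml(messages)
```

The resulting string can be tokenized and passed to any runtime that serves the merged weights; the manual version is only meant to make the turn structure visible.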