hongli-zhan/MINT-empathy-Qwen3-1.7B
Text generation · Concurrency cost: 1 · Model size: 2B · Quantization: BF16 · Context length: 32k · Published: Apr 10, 2026 · License: MIT · Architecture: Transformer · Open weights

MINT-empathy-Qwen3-1.7B by hongli-zhan is a 1.7-billion-parameter language model fine-tuned from Qwen3-1.7B for generating empathic dialogue. It is trained with Multi-turn Inter-tactic Novelty Training (MINT), a reinforcement learning framework that diversifies empathy tactics across conversation turns to prevent repetitive responses. By rewarding both empathy quality and cross-turn tactic novelty, the model targets emotional support conversations and other applications that require nuanced, varied empathic responses.
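A minimal usage sketch with the Hugging Face `transformers` library, assuming the model is hosted under the id shown above and ships a chat template (as Qwen3-derived checkpoints typically do); the prompt text and generation settings are illustrative, not from this page:

```python
MODEL_ID = "hongli-zhan/MINT-empathy-Qwen3-1.7B"  # id from this model card


def build_messages(user_turn: str) -> list[dict]:
    """Wrap a single emotional-support turn in the standard chat format."""
    return [{"role": "user", "content": user_turn}]


def generate_reply(user_turn: str, max_new_tokens: int = 256) -> str:
    # Import here so the module loads even without transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype="auto")

    # Apply the model's chat template and generate a continuation.
    inputs = tokenizer.apply_chat_template(
        build_messages(user_turn),
        add_generation_prompt=True,
        return_tensors="pt",
    )
    output = model.generate(inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, skipping the prompt.
    return tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True)


if __name__ == "__main__":
    print(generate_reply("I just lost my job and I feel worthless."))
```

For multi-turn use, append each model reply and subsequent user turn to the messages list before regenerating, which is the setting MINT's cross-turn tactic novelty is designed for.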
