laion/minimax-m2-stack-overflow-32ep-131k-summtrc
Task: Text generation
Concurrency cost: 1
Model size: 8B
Quantization: FP8
Context length: 32k
Published: Dec 12, 2025
License: apache-2.0
Architecture: Transformer (open weights)
laion/minimax-m2-stack-overflow-32ep-131k-summtrc is an 8-billion-parameter language model fine-tuned from Qwen/Qwen3-8B on the penfever/minimax-m2-stack-overflow-32ep-131k-summtrc dataset. The fine-tuning adapts the base model specifically to Stack Overflow content, targeting tasks in that domain.