MASWorks/MAS-GPT-32B

Text Generation · Concurrency Cost: 2 · Model Size: 32.8B · Quant: FP8 · Context Length: 32k · License: other · Architecture: Transformer

MASWorks/MAS-GPT-32B is a 32.8 billion parameter language model developed by MASWorks, fine-tuned from Qwen2.5-Coder-32B-Instruct. It specializes in generating query-specific LLM-based multi-agent systems and supports a context length of 131,072 tokens, making it well suited to tasks that require the automated construction and orchestration of multi-agent architectures.


MAS-GPT-32B Overview

MAS-GPT-32B is a 32.8 billion parameter language model developed by MASWorks, engineered for the automated generation of LLM-based multi-agent systems. Fine-tuned from Qwen2.5-Coder-32B-Instruct, it interprets a user query and translates it into a functional multi-agent system design.

Key Capabilities

  • Multi-Agent System Generation: Automatically constructs LLM-based multi-agent systems tailored to specific queries.
  • High Context Understanding: Benefits from a substantial context window of 131,072 tokens, enabling complex query processing and system design.
  • Foundation in Code Generation: Built upon a coder-focused base model, suggesting strong capabilities in structured output and logical system assembly.
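As a sketch of how one might prompt the model, here is a minimal chat-completion style request builder. The system instruction and message layout below are assumptions for illustration; the exact prompt format MAS-GPT expects is documented in the MAS-GPT paper and repository, not here.

```python
# Hypothetical sketch: assembling a chat-style request that asks MAS-GPT-32B
# to design a multi-agent system for a given query. The system prompt and
# payload shape are assumptions, not the model's documented interface.

def build_mas_request(query: str, max_tokens: int = 4096) -> dict:
    """Assemble a chat-completion style payload for MAS-GPT-32B."""
    return {
        "model": "MASWorks/MAS-GPT-32B",
        "messages": [
            {
                "role": "system",
                "content": "Design an LLM-based multi-agent system "
                           "tailored to the user's query.",
            },
            {"role": "user", "content": query},
        ],
        "max_tokens": max_tokens,
    }

payload = build_mas_request("Prove that the sum of two even numbers is even.")
print(payload["model"])          # MASWorks/MAS-GPT-32B
print(len(payload["messages"]))  # 2
```

Such a payload could be sent to any OpenAI-compatible serving endpoint hosting the model; the response would contain the generated multi-agent system specification.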

Good For

  • Automating Agent Orchestration: Ideal for researchers and developers looking to rapidly prototype or deploy multi-agent systems without manual design.
  • Complex Task Decomposition: Useful in scenarios where a single LLM struggles and a coordinated system of specialized agents is more effective.
  • Research in Agentic AI: Provides a powerful tool for exploring and experimenting with different multi-agent architectures and their applications. Further details can be found in the associated research paper: MAS-GPT: Training LLMs to Build LLM-based Multi-Agent Systems.
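To make the idea of a coordinated system of specialized agents concrete, here is a generic illustration of the kind of pipeline MAS-GPT is trained to generate: a solver agent drafts an answer, a verifier agent checks it, and an aggregator agent produces the final response. The agent roles and the `call_llm` stub are hypothetical; an actual generated system would call a real LLM backend.

```python
# Illustrative sketch only: a minimal solver -> verifier -> aggregator
# multi-agent pipeline. `call_llm` is a stub standing in for a real LLM call.

def call_llm(role: str, prompt: str) -> str:
    # Placeholder for a real LLM invocation (e.g. an API call).
    return f"[{role}] response to: {prompt}"

def solver(query: str) -> str:
    return call_llm("solver", f"Solve step by step: {query}")

def verifier(query: str, answer: str) -> str:
    return call_llm("verifier", f"Check this answer to '{query}': {answer}")

def run_mas(query: str) -> str:
    answer = solver(query)
    review = verifier(query, answer)
    return call_llm("aggregator",
                    f"Produce a final answer from: {answer}; review: {review}")

result = run_mas("What is 17 * 24?")
```

The value of a model like MAS-GPT is that this topology, and the role prompts inside it, are generated per query rather than hand-designed.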