MASWorks/MAS-GPT-32B

32.8B parameters · FP8 · 131,072-token context · License: other

MAS-GPT-32B Overview

MAS-GPT-32B is a 32.8 billion parameter language model developed by MASWorks, engineered specifically for the automated generation of LLM-based multi-agent systems. Fine-tuned from Qwen2.5-Coder-32B-Instruct, the model interprets a user query and translates it into a functional multi-agent system design.

Key Capabilities

  • Multi-Agent System Generation: Automatically constructs LLM-based multi-agent systems tailored to specific queries (a minimal usage sketch follows this list).
  • Long Context: Supports a 131,072-token context window, enabling complex query processing and detailed system design.
  • Foundation in Code Generation: Built upon a coder-focused base model, suggesting strong capabilities in structured output and logical system assembly.
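
A minimal usage sketch, assuming the model is loaded through the standard transformers chat interface inherited from its Qwen2.5 base and that the user query is passed as a plain chat message; the exact prompt format used to train MAS-GPT may differ, so treat this as illustrative rather than the official recipe:

```python
# Minimal sketch: ask MAS-GPT-32B to generate a multi-agent system for a query.
# Assumptions: the standard Qwen2.5 chat template applies, and the query goes
# in as a single user message.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "MASWorks/MAS-GPT-32B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype="auto", device_map="auto"
)

query = "Plan and write a short data-analysis report from a CSV file."
messages = [{"role": "user", "content": query}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

# The model is expected to emit a multi-agent system definition for the query.
outputs = model.generate(inputs, max_new_tokens=2048)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```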

Good For

  • Automating Agent Orchestration: Ideal for researchers and developers looking to rapidly prototype or deploy multi-agent systems without manual design.
  • Complex Task Decomposition: Useful in scenarios where a single LLM struggles and a coordinated system of specialized agents is more effective (an illustrative decomposition follows this list).
  • Research in Agentic AI: Provides a powerful tool for exploring and experimenting with different multi-agent architectures and their applications. Further details can be found in the associated research paper: MAS-GPT: Training LLMs to Build LLM-based Multi-Agent Systems.
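
To make "a coordinated system of specialized agents" concrete, here is a hypothetical planner → solver → critic pipeline of the kind MAS-GPT-32B is trained to produce. This is not the model's actual output format; `build_mas` and `call_llm` are illustrative names, and `call_llm` stands in for whatever LLM backend a generated system would target.

```python
# Hypothetical illustration of a generated multi-agent pipeline:
# a planner decomposes the task, a solver executes the plan, a critic revises.
from typing import Callable

def build_mas(call_llm: Callable[[str], str]) -> Callable[[str], str]:
    """Compose a planner -> solver -> critic pipeline around an LLM call."""

    def run(query: str) -> str:
        # Planner agent: break the query into ordered sub-tasks.
        plan = call_llm(f"Decompose this task into numbered steps:\n{query}")
        # Solver agent: address the task with the plan as context.
        draft = call_llm(
            f"Follow this plan to answer the task.\nPlan:\n{plan}\nTask:\n{query}"
        )
        # Critic agent: review the draft and return a revised final answer.
        return call_llm(
            f"Review and improve this answer.\nTask:\n{query}\nAnswer:\n{draft}"
        )

    return run

if __name__ == "__main__":
    # Echo backend so the sketch runs without any model; swap in a real LLM call.
    mas = build_mas(lambda prompt: f"[LLM response to: {prompt[:40]}...]")
    print(mas("Summarize the trade-offs between two database designs."))
```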