MASWorks/MAS-GPT-32B is a 32.8-billion-parameter language model developed by MASWorks, fine-tuned from Qwen2.5-Coder-32B-Instruct. It specializes in generating query-specific LLM-based multi-agent systems and supports a context length of 131,072 tokens, making it suited to tasks that require the automated construction and orchestration of multi-agent architectures.
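As a minimal sketch, the model can be queried through the standard Hugging Face `transformers` chat interface. The system prompt and helper below are illustrative assumptions, not the official prompt format; consult the model card for the exact chat template.

```python
# Sketch of querying MAS-GPT-32B via Hugging Face transformers.
# The system prompt here is an illustrative assumption, not the official one.

def build_messages(query: str) -> list[dict]:
    """Assemble a chat-format request asking the model to emit a
    query-specific multi-agent system for `query`."""
    return [
        {"role": "system",
         "content": "Generate an LLM-based multi-agent system for the user's query."},
        {"role": "user", "content": query},
    ]

messages = build_messages("Design agents to summarize and fact-check a news article.")
print(messages)

# Heavyweight steps, shown for illustration (loading the 32.8B weights
# requires substantial GPU memory):
# from transformers import AutoModelForCausalLM, AutoTokenizer
# tok = AutoTokenizer.from_pretrained("MASWorks/MAS-GPT-32B")
# model = AutoModelForCausalLM.from_pretrained("MASWorks/MAS-GPT-32B", device_map="auto")
# inputs = tok.apply_chat_template(messages, add_generation_prompt=True, return_tensors="pt")
# out = model.generate(inputs.to(model.device), max_new_tokens=1024)
# print(tok.decode(out[0], skip_special_tokens=True))
```

The commented lines follow the usual `transformers` generation flow for Qwen2.5-family checkpoints; only the message-building portion runs without downloading the weights.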