DCAgent/a1-agenttuning_mind2web
Text generation · Model size: 8B · Quantization: FP8 · Context length: 32k · Concurrency cost: 1 · Published: Mar 25, 2026 · License: other · Architecture: Transformer

DCAgent/a1-agenttuning_mind2web is an 8-billion-parameter language model fine-tuned from Qwen/Qwen3-8B. It is optimized for agentic tasks and was trained on the neulab-agenttuning-mind2web-sandboxes_glm_4.7_traces_jupiter dataset, with the goal of improving performance on web-based agent interactions and automated task execution.
