DCAgent/a1-mind2web
Text generation
Concurrency cost: 1
Model size: 8B
Quantization: FP8
Context length: 32k
Published: Mar 26, 2026
License: Other
Architecture: Transformer
DCAgent/a1-mind2web is an 8-billion-parameter language model fine-tuned from Qwen/Qwen3-8B on the neulab-mind2web-sandboxes_glm_4.7_traces_jupiter dataset. The model is optimized for web interaction and understanding, building on its base architecture for improved performance in web-based environments. Its specialized training makes it suitable for applications that require navigating web interfaces or extracting data from them.