stanfordnlp/llama8b-nnetnav-wa
Text Generation | Concurrency Cost: 1 | Model Size: 8B | Quant: FP8 | Ctx Length: 32k | Published: Jan 18, 2025 | License: apache-2.0 | Architecture: Transformer | Open Weights

stanfordnlp/llama8b-nnetnav-wa is an 8-billion-parameter model based on Llama-3.1-8B, instruction-tuned by Stanford NLP on NNetNav-WA data for web-agent tasks. It navigates and interacts with websites from natural-language instructions, performing actions such as clicking, typing, and tab management. The model is optimized for web automation in WebArena-like environments, where it achieves a 16.3% success rate on the WebArena benchmark, making it suitable for controlled web-interaction scenarios.
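A web agent like this typically emits actions as short text commands that a harness must parse before executing them in a browser. The sketch below shows one way such parsing might look; the exact action grammar (`click [id]`, `type [id] [text]`, tab commands) is an assumption modeled on WebArena-style conventions, not taken from this model card.

```python
import re

# Hypothetical WebArena-style action grammar, e.g.:
#   "click [42]"               -> click element with id 42
#   "type [7] [hello world]"   -> type text into element 7
#   "new_tab" / "close_tab"    -> tab management
ACTION_RE = re.compile(
    r"^(?P<op>click|type|new_tab|close_tab)"
    r"(?:\s+\[(?P<target>\d+)\])?"      # optional element id
    r"(?:\s+\[(?P<text>[^\]]*)\])?$"    # optional text payload
)

def parse_action(action: str):
    """Parse a raw action string into (op, target_id, text).

    Raises ValueError if the string does not match the assumed grammar.
    """
    m = ACTION_RE.match(action.strip())
    if not m:
        raise ValueError(f"unrecognized action: {action!r}")
    return m.group("op"), m.group("target"), m.group("text")
```

In a real harness, the parsed tuple would then be dispatched to a browser-automation backend (e.g. Playwright); the regex here only illustrates the parsing step.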
