yzhuang/Llama-3.1-8B-Instruct-AgenticLU
Text Generation · Concurrency Cost: 1 · Model Size: 8B · Quant: FP8 · Ctx Length: 32k · Published: Feb 10, 2025 · License: MIT · Architecture: Transformer · Open Weights
yzhuang/Llama-3.1-8B-Instruct-AgenticLU is an 8 billion parameter instruction-tuned model based on Llama-3.1, developed by yzhuang. It is specifically designed for robust long-document understanding, utilizing an agentic approach that refines complex, long-context queries through self-clarifications and contextual grounding. With a 32768-token context length, this model excels at processing and comprehending extensive textual information in a single pass, making it suitable for advanced QA and information extraction from lengthy documents.
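As a sketch of how the model might be queried for long-document QA, the snippet below builds an OpenAI-compatible chat-completions payload that packs an entire document plus a question into one request. The helper function, the system prompt, and the 4-characters-per-token heuristic are illustrative assumptions, not part of this page; only the model ID and the 32,768-token context length come from the listing above.

```python
MODEL_ID = "yzhuang/Llama-3.1-8B-Instruct-AgenticLU"
CTX_LIMIT = 32_768  # advertised context length in tokens

def build_long_doc_request(document: str, question: str,
                           max_new_tokens: int = 512) -> dict:
    """Pack a whole document plus a question into one chat request.

    A crude 4-chars-per-token heuristic guards against overflowing the
    32k window; production code should count tokens with the tokenizer.
    """
    approx_tokens = len(document) // 4
    if approx_tokens + max_new_tokens > CTX_LIMIT:
        raise ValueError(
            f"document (~{approx_tokens} tokens) may exceed the context window"
        )
    return {
        "model": MODEL_ID,
        "messages": [
            {"role": "system",
             "content": "Answer strictly from the provided document."},
            {"role": "user",
             "content": f"Document:\n{document}\n\nQuestion: {question}"},
        ],
        "max_tokens": max_new_tokens,
    }

payload = build_long_doc_request("Example report text...",
                                 "What is the main finding?")
print(payload["model"])  # yzhuang/Llama-3.1-8B-Instruct-AgenticLU
```

The payload can then be sent to any endpoint that serves this model behind an OpenAI-compatible API; a single-pass request like this is where the 32k window matters, since the document is not chunked.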