THUDM/LongCite-llama3.1-8b is an 8-billion-parameter language model developed by THUDM, based on Meta-Llama-3.1-8B. It is trained specifically to generate fine-grained citations in long-context question answering. The model supports an extended context window of up to 128K tokens, making it suitable for answering questions over, and citing evidence from, very long documents.
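
Below is a minimal sketch of loading the model with Hugging Face transformers and asking a question over a long document. The prompt layout, file name (`long_document.txt`), and generation settings are illustrative assumptions; the exact citation-aware prompting interface (including any helper exposed via `trust_remote_code`) should be taken from the official model card.

```python
# Sketch: long-context QA with THUDM/LongCite-llama3.1-8b via transformers.
# Assumes a GPU with enough memory for the 8B weights in bfloat16 (~16 GB).
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

model_id = "THUDM/LongCite-llama3.1-8b"
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
    trust_remote_code=True,
)

# Hypothetical inputs: a long source document plus a question about it.
context = open("long_document.txt").read()  # up to ~128K tokens of source text
query = "What are the main findings of the report?"

# Illustrative prompt: the document followed by the question.
messages = [{"role": "user", "content": f"{context}\n\nQuestion: {query}"}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(input_ids, max_new_tokens=1024, do_sample=False)
# Decode only the newly generated tokens (the answer with its citations).
print(tokenizer.decode(output[0][input_ids.shape[1]:], skip_special_tokens=True))
```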