zai-org/LongAlign-13B-64k
Text Generation · Model Size: 13B · Quantization: FP8 · Context Length: 64k · Concurrency Cost: 1 · Published: Jan 29, 2024 · License: apache-2.0 · Architecture: Transformer

LongAlign-13B-64k, developed by THUDM, is a 13-billion-parameter chat model based on Llama-2-13B and fine-tuned for long-context understanding and instruction following. Its extended context window of 64,000 tokens lets it process and respond to very long inputs, making it well suited to applications that require deep comprehension of lengthy documents or conversations.
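One practical concern when working with a 64k-token model is keeping inputs inside the context budget. The sketch below is illustrative only: it uses whitespace splitting as a stand-in for the model's real tokenizer, and the `fit_to_context` helper name and 64,000 default are assumptions, not part of the model's API.

```python
def fit_to_context(text: str, max_tokens: int = 64_000) -> str:
    """Truncate text to at most max_tokens tokens.

    Illustrative sketch: whitespace splitting stands in for the
    model's actual tokenizer, which should be used in real code
    to count tokens accurately.
    """
    tokens = text.split()
    if len(tokens) <= max_tokens:
        return text
    # Keep the first max_tokens tokens; a real pipeline might
    # instead keep the tail or summarize the middle.
    return " ".join(tokens[:max_tokens])


if __name__ == "__main__":
    long_doc = "word " * 70_000
    trimmed = fit_to_context(long_doc)
    print(len(trimmed.split()))
```

In practice, the token count would come from the tokenizer loaded alongside the checkpoint (for example, via Hugging Face `transformers`' `AutoTokenizer.from_pretrained`), so that the budget matches what the model actually sees.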
