zai-org/LongAlign-7B-64k-base
Text generation · Concurrency cost: 1 · Model size: 7B · Quant: FP8 · Ctx length: 4k · Published: Jan 29, 2024 · License: apache-2.0 · Architecture: Transformer

LongAlign-7B-64k-base is a 7-billion-parameter Llama-2-based language model developed by THUDM, designed specifically for long-context understanding. It features an extended context window of 64,000 tokens, making it suitable for tasks that require processing long documents or conversations. This model serves as the base for further alignment on long-context instruction following.