Yukang/Llama-2-7b-longlora-32k-ft
Text generation · Concurrency cost: 1 · Model size: 7B · Quantization: FP8 · Context length: 32k · Published: Sep 12, 2023 · Architecture: Transformer

Yukang/Llama-2-7b-longlora-32k-ft is a 7-billion-parameter Llama-2-based language model developed by Yukang Chen et al. It is fine-tuned with the LongLoRA method, which efficiently extends its context window to 32,768 tokens. The model is designed for processing and understanding long-context inputs, making it suitable for tasks that require analysis of extended documents.
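A minimal sketch of loading this checkpoint with the Hugging Face `transformers` library is shown below. The repository ID comes from the model card; the generation parameters (`max_new_tokens`, `torch_dtype`) are illustrative assumptions, not values prescribed by the authors, and a GPU with sufficient memory is assumed for a 7B model.

```python
# Hedged example: loading Yukang/Llama-2-7b-longlora-32k-ft via transformers.
# The dtype and generation settings below are assumptions for illustration.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "Yukang/Llama-2-7b-longlora-32k-ft"


def generate(prompt: str, max_new_tokens: int = 128) -> str:
    """Load the model lazily and generate a continuation for `prompt`."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype="auto",   # let transformers pick the checkpoint dtype
        device_map="auto",    # place weights on available accelerators
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)
```

Because LongLoRA extends the context window to 32,768 tokens, prompts far longer than the base Llama-2 4k limit can be passed to `generate` without truncation, subject to available memory.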
