emozilla/open_llama_7b-scaled
Text generation · Model size: 7B · Quantization: FP8 · Context length: 4k · Concurrency cost: 1 · Published: Jun 22, 2023 · License: apache-2.0 · Architecture: Transformer · Open weights

emozilla/open_llama_7b-scaled is a 7-billion-parameter causal language model based on OpenLM Research's OpenLLaMA, an open-source reproduction of Meta AI's LLaMA. This variant incorporates Scaled Rotary Embeddings, which allow configurable context lengths beyond the default 2048 tokens, such as 4096 or 8192. The base model was trained on 1 trillion tokens from the RedPajama dataset and performs comparably to the original LLaMA and GPT-J models, making it suitable for general-purpose language generation and understanding tasks.
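A minimal sketch of how scaled rotary embeddings extend the context window, assuming the "scaled" variant uses linear position interpolation (positions are divided by a scale factor so that, e.g., 4096 positions map into the 2048-position range the model was trained on); the function names here are illustrative, not the model's actual API:

```python
import numpy as np

def rotary_angles(positions, dim, base=10000.0, scale=1.0):
    # Standard RoPE frequency schedule; scale > 1 compresses positions
    # (linear interpolation), so a 2048-token model can attend over
    # 4096 (scale=2) or 8192 (scale=4) tokens.
    inv_freq = 1.0 / (base ** (np.arange(0, dim, 2) / dim))
    pos = np.asarray(positions, dtype=np.float64) / scale
    return np.outer(pos, inv_freq)  # shape: (len(positions), dim // 2)

def apply_rope(x, positions, scale=1.0):
    # x: (seq_len, dim) slice of queries or keys; rotates channel pairs
    # by the position-dependent angles.
    seq_len, dim = x.shape
    angles = rotary_angles(positions, dim, scale=scale)
    cos, sin = np.cos(angles), np.sin(angles)
    x1, x2 = x[:, 0::2], x[:, 1::2]
    out = np.empty_like(x)
    out[:, 0::2] = x1 * cos - x2 * sin
    out[:, 1::2] = x1 * sin + x2 * cos
    return out
```

With scale=2, position 4096 produces the same rotation angles as position 2048 did during training, which is why the extended context stays within the distribution the model has seen.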
