sharpbai/open_llama_7b
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · License: apache-2.0 · Architecture: Transformer · Open Weights · Cold

OpenLLaMA 7B is an open-source reproduction of the LLaMA architecture, developed by openlm-research. This repository hosts the original 7-billion-parameter model with its weights repartitioned into smaller (roughly 405 MB) shards for easier downloading and loading. It serves as a foundational model for research and development in large language models, offering a permissively licensed alternative for a range of natural language processing tasks.
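As a sketch of how a checkpoint like this is typically used, the snippet below loads the model through the Hugging Face `transformers` Auto classes and generates a short completion. The repo id matches this model card; the prompt and generation settings are illustrative assumptions, and the heavy download only happens when the script is run directly.

```python
REPO_ID = "sharpbai/open_llama_7b"  # repo id from this model card

def generate(prompt: str, max_new_tokens: int = 64) -> str:
    """Generate a completion for `prompt` using this checkpoint.

    Imports are done lazily so importing this module stays cheap;
    the first call downloads the sharded weights from the Hub.
    """
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(REPO_ID)
    model = AutoModelForCausalLM.from_pretrained(REPO_ID)
    inputs = tokenizer(prompt, return_tensors="pt")
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)

if __name__ == "__main__":
    # Example prompt; the base model does plain continuation, not chat.
    print(generate("Large language models are"))
```

Note that this is a base (non-instruction-tuned) model, so it is best prompted with text to continue rather than chat-style questions.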
