jwu323/origin-llama-7b
The jwu323/origin-llama-7b model provides the original 7 billion parameter LLaMA weights for research and development purposes. This foundational model has a context length of 2048 tokens and is intended for users who have been granted access to the LLaMA weights under their non-commercial license. It serves as a direct source for the initial LLaMA checkpoint, facilitating conversion to the Hugging Face Transformers format.
jwu323/origin-llama-7b: Original LLaMA Weights
This repository hosts the original 7 billion parameter LLaMA model weights. It is specifically intended for users who have already obtained access to the LLaMA model under its non-commercial license but require the original weights, potentially for conversion to the Hugging Face Transformers format or if their initial copy was lost.
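For the conversion use case mentioned above, the transformers library ships a conversion script for original LLaMA checkpoints; a minimal sketch, where all paths are placeholders to be substituted with your local directories:

```shell
# Convert the original LLaMA checkpoint to the Hugging Face Transformers format.
# --input_dir must point at the directory containing the downloaded weights
# and the tokenizer.model file; --output_dir is wherever the converted
# checkpoint should be written. Run from a checkout of the transformers repo.
python src/transformers/models/llama/convert_llama_weights_to_hf.py \
    --input_dir /path/to/original/llama/weights \
    --model_size 7B \
    --output_dir /path/to/llama-7b-hf
```

Note that the conversion requires enough free RAM to hold the full model in float16 (roughly 14 GB for the 7B checkpoint).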
Key Characteristics
- Model Family: LLaMA
- Parameter Count: 7 billion parameters
- Context Length: 2048 tokens
- License: Non-commercial (requires prior access grant)
Intended Use
This model is primarily for:
- Researchers and developers with existing LLaMA access.
- Facilitating conversion of original LLaMA weights to the Hugging Face Transformers format.
- Ensuring access to the foundational LLaMA-7b weights for licensed users.
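After conversion, the weights can be loaded through the standard Transformers API; a minimal sketch, assuming access has been granted and using a placeholder path for the converted checkpoint:

```python
from transformers import LlamaForCausalLM, LlamaTokenizer

# Placeholder: substitute the directory produced by the conversion script.
model_path = "/path/to/llama-7b-hf"

tokenizer = LlamaTokenizer.from_pretrained(model_path)
model = LlamaForCausalLM.from_pretrained(model_path)

# Simple greedy generation as a smoke test of the converted weights.
inputs = tokenizer("The LLaMA architecture is", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

As a base (non-instruction-tuned) model, it is suited to prompt completion and fine-tuning rather than chat-style use.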