wxjiao/llama-7b
Task: Text Generation · Model Size: 7B · Quantization: FP8 · Context Length: 4k · Architecture: Transformer · Concurrency Cost: 1
The wxjiao/llama-7b model is a 7-billion-parameter model based on the LLaMA architecture, originally developed by Meta AI (formerly Facebook AI Research), converted for compatibility with the Hugging Face Transformers library. It is intended primarily for research use, providing a foundational large language model in the LLaMA family for experimentation and study.
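Since the checkpoint is converted for the Hugging Face Transformers library, it can be loaded with the standard `AutoModelForCausalLM` / `AutoTokenizer` API. The sketch below is illustrative, not an official usage example: the prompt and generation settings are assumptions, and the small helper that clamps generation to the model's 4k context window is a hypothetical convenience, not part of the model card.

```python
def clamp_new_tokens(prompt_len: int, requested: int, ctx_len: int = 4096) -> int:
    """Clamp the number of generated tokens so that prompt + generation
    stays within the model's 4k context window (hypothetical helper)."""
    return max(0, min(requested, ctx_len - prompt_len))


def main() -> None:
    # Heavy dependency imported lazily so the helper above stays importable
    # without transformers/torch installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "wxjiao/llama-7b"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        torch_dtype="auto",   # pick a dtype appropriate for the hardware
        device_map="auto",    # place weights on available GPU(s)/CPU
    )

    # Illustrative prompt; any research prompt would work the same way.
    inputs = tokenizer("The LLaMA architecture is", return_tensors="pt").to(model.device)
    max_new = clamp_new_tokens(inputs["input_ids"].shape[1], requested=128)
    output = model.generate(**inputs, max_new_tokens=max_new)
    print(tokenizer.decode(output[0], skip_special_tokens=True))


if __name__ == "__main__":
    main()
```

Note that loading a 7B model in full precision requires roughly 14 GB of memory in fp16/bf16; `device_map="auto"` lets Transformers shard or offload the weights if a single device is too small.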