Mithilss/Llama-2-7b-hf
Text generation · Concurrency cost: 1 · Model size: 7B · Quantization: FP8 · Context length: 4k · Architecture: Transformer

Mithilss/Llama-2-7b-hf is a 7 billion parameter pretrained generative text model from the Llama 2 family developed by Meta. The model uses an optimized transformer architecture and supports a context length of 4096 tokens. It is intended for commercial and research use in English, serving as a foundation model that can be adapted for a range of natural language generation tasks. The Llama 2 series was trained on 2 trillion tokens of publicly available online data.
