dariolopez/llama-2-7b-oasst1-es
Task: Text Generation
Model Size: 7B
Quantization: FP8
Context Length: 4k
Concurrency Cost: 1
License: apache-2.0
Architecture: Transformer
Weights: Open

The dariolopez/llama-2-7b-oasst1-es model is a 7-billion-parameter language model published by dariolopez and fine-tuned for Spanish. It is based on the Llama 2 architecture and was trained on the dariolopez/Llama-2-oasst1-es dataset. The model targets Spanish-centric natural language processing tasks, making it a good fit for applications that need strong Spanish understanding and generation.
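A minimal sketch of how the model might be loaded and queried with the Hugging Face `transformers` library. The prompt template below is an assumption (Llama 2's `[INST]` chat format); the actual format expected by this fine-tune may differ, so check the model card before relying on it. The heavy imports and model download are kept inside the main guard so the helper can be inspected without pulling 7B of weights.

```python
MODEL_ID = "dariolopez/llama-2-7b-oasst1-es"


def build_prompt(question: str) -> str:
    # Assumed Llama 2 instruction format; verify against the model card.
    return f"<s>[INST] {question} [/INST]"


if __name__ == "__main__":
    # Imports deferred so the prompt helper stays lightweight to test.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype=torch.float16,
        device_map="auto",
    )

    prompt = build_prompt("¿Qué es el aprendizaje automático?")
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=128)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

With only a 4k context window, long conversations or documents must be truncated or summarized before being fed to the model.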
