dariolopez/Llama-2-databricks-dolly-oasst1-es-axolotl

Text generation · 7B parameters · Quantization: FP8 · Context length: 4k · License: apache-2.0 · Architecture: Transformer · Open weights

dariolopez/Llama-2-databricks-dolly-oasst1-es-axolotl is a 7-billion-parameter Llama 2 model fine-tuned for Spanish-language instruction following. It was trained on a custom Spanish instruction dataset that combines material from Databricks Dolly and OASST1, improving its conversational abilities in Spanish. With a 4096-token context length, the model generates coherent, contextually relevant Spanish responses, making it suitable for Spanish-centric NLP applications.


Overview

This model, dariolopez/Llama-2-databricks-dolly-oasst1-es-axolotl, is a 7 billion parameter variant of the Llama 2 architecture. It has been specifically fine-tuned to improve its performance and instruction-following capabilities in the Spanish language. The fine-tuning process utilized a custom Spanish instructions dataset, which integrates data from Databricks Dolly and OASST1, adapted for Spanish.
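A minimal usage sketch with the Hugging Face `transformers` library is shown below. The Alpaca-style Spanish prompt template in `build_prompt` is an assumption for illustration (the exact instruction format is not stated here; consult the model card before relying on it), and generation parameters such as `max_new_tokens` and `temperature` are illustrative defaults.

```python
# Hedged sketch: querying the model via Hugging Face transformers.
# Assumption: an Alpaca-style Spanish instruction template; verify the
# actual prompt format against the model card.

def build_prompt(instruction: str) -> str:
    """Wrap a Spanish instruction in an assumed Alpaca-style template."""
    return (
        "A continuación hay una instrucción que describe una tarea. "
        "Escribe una respuesta que complete adecuadamente la petición.\n\n"
        f"### Instrucción:\n{instruction}\n\n### Respuesta:\n"
    )

if __name__ == "__main__":
    # Requires `pip install transformers accelerate torch` and enough
    # GPU/CPU memory for a 7B model.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "dariolopez/Llama-2-databricks-dolly-oasst1-es-axolotl"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

    prompt = build_prompt("Explica brevemente qué es un modelo de lenguaje.")
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(
        **inputs, max_new_tokens=256, do_sample=True, temperature=0.7
    )
    # Decode only the newly generated tokens, skipping the prompt.
    print(tokenizer.decode(
        output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
    ))
```

Keep the combined prompt and response within the 4096-token context window; longer conversations need truncation or summarization of earlier turns.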

Key Capabilities

  • Spanish Language Proficiency: Enhanced understanding and generation of Spanish text.
  • Instruction Following: Improved ability to adhere to given instructions and prompts in Spanish.
  • Conversational AI: Suitable for developing chatbots and conversational agents that interact in Spanish.

Good For

  • Applications requiring robust Spanish language processing.
  • Generating human-like text responses in Spanish.
  • Developing virtual assistants or customer support systems for Spanish-speaking users.