artificialguybr/OpenHermesV2-PTBR

Text Generation · Model Size: 7B · Quant: FP8 · Context Length: 4k · Published: Oct 29, 2023 · License: apache-2.0 · Architecture: Transformer · Open Weights

artificialguybr/OpenHermesV2-PTBR is a 7-billion-parameter Mistral-based language model, fine-tuned by artificialguybr for Brazilian Portuguese (PT-BR) using the OpenHermes 2 dataset. That dataset comprises 900,000 entries of primarily GPT-4 generated data, and the model is optimized for chat and instruction following in Portuguese. It is designed to assist users with a wide range of requests and supports multi-turn dialogue through its ChatML prompt format.


OpenHermesV2-PTBR: A Portuguese-Optimized Mistral 7B Model

This model, developed by artificialguybr, is a Portuguese (PT-BR) fine-tuned version of Teknium's OpenHermes 2 Mistral 7B. OpenHermes 2 was originally trained on 900,000 entries of primarily GPT-4 generated data, sourced from various open datasets. The training involved extensive filtering and conversion of data to the ShareGPT format, further transformed by axolotl to utilize ChatML.

Key Capabilities

  • Portuguese Language Proficiency: Specifically fine-tuned for high-quality interactions and responses in Brazilian Portuguese.
  • Advanced Instruction Following: Designed to effectively follow complex instructions, especially within multi-turn conversations.
  • ChatML Prompt Format: Utilizes the ChatML format, enabling structured system prompts and multi-turn chat dialogue, similar to OpenAI's API.
  • Role-playing and Creative Generation: Capable of engaging in role-play scenarios and generating creative content, as demonstrated by examples like gourmet meal recipes and character impersonations.
  • System Prompt Utilization: Trained to make strong use of system prompts, so instructions given at the start of a conversation continue to steer responses across many turns, enhancing contextual understanding.
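
The ChatML structure described above can be sketched in a few lines of Python. The `<|im_start|>`/`<|im_end|>` markers and the system/user/assistant roles follow the ChatML convention; the example messages themselves are hypothetical:

```python
def to_chatml(messages):
    """Render a list of {role, content} dicts into a ChatML prompt string.

    Each turn is wrapped in <|im_start|>role ... <|im_end|> markers, and the
    prompt ends with an open assistant turn for the model to complete.
    """
    parts = []
    for msg in messages:
        parts.append(f"<|im_start|>{msg['role']}\n{msg['content']}<|im_end|>")
    parts.append("<|im_start|>assistant\n")
    return "\n".join(parts)

# Hypothetical multi-turn setup in Brazilian Portuguese.
messages = [
    {"role": "system", "content": "Você é um assistente prestativo que responde em português."},
    {"role": "user", "content": "Qual é a capital do Brasil?"},
]
prompt = to_chatml(messages)
print(prompt)
```

This mirrors the message structure used by OpenAI's chat API, which is why the format translates naturally to multi-turn dialogue.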

Good For

  • Portuguese-speaking applications: Ideal for chatbots, virtual assistants, and content generation tasks requiring native Portuguese language understanding and generation.
  • Instruction-tuned tasks: Excels in scenarios where precise instruction following and contextual awareness are crucial.
  • Multi-turn conversations: Its ChatML format makes it well-suited for interactive dialogue systems and conversational AI.
  • Developers familiar with the OpenAI API: ChatML compatibility simplifies integration for those accustomed to OpenAI's prompt structure.
  • Local deployment: Recommended for use with tools like LM Studio, which supports GGUF models and ChatML out-of-the-box, providing a ChatGPT-like interface.
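
Because the model speaks ChatML and tools like LM Studio expose an OpenAI-compatible local server, a standard chat-completions payload is enough to talk to it. The sketch below uses only the standard library; the endpoint URL, port, and model identifier are assumptions and will depend on your local setup:

```python
import json
import urllib.request

def build_chat_request(messages, model="openhermesv2-ptbr", temperature=0.7):
    """Assemble an OpenAI-style chat-completions payload for a local server."""
    return {
        "model": model,          # identifier as registered locally (assumed)
        "messages": messages,
        "temperature": temperature,
    }

def send_chat_request(payload, url="http://localhost:1234/v1/chat/completions"):
    """POST the payload to a local OpenAI-compatible endpoint (URL assumed)."""
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

payload = build_chat_request([
    {"role": "system", "content": "Responda sempre em português."},
    {"role": "user", "content": "Explique o que é aprendizado de máquina."},
])
# send_chat_request(payload) would perform the call once a local server is running.
print(json.dumps(payload, ensure_ascii=False, indent=2))
```

The payload shape is the same one the OpenAI API expects, so client code written against that API can usually be pointed at the local endpoint by changing only the base URL.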