ericpolewski/AIRIC-The-Mistral

Text generation · Concurrency cost: 1 · Model size: 7B · Quantization: FP8 · Context length: 8k · Published: Dec 23, 2023 · License: MIT · Architecture: Transformer · Open weights

ericpolewski/AIRIC-The-Mistral is a 7 billion parameter Mistral-v0.1 based instruction-tuned language model developed by ericpolewski. It is fine-tuned with a unique blend of the AIRIC dataset and OpenOrca data, specifically designed for conversational interaction and assistant-like functions. This model excels at generating social and personalized responses, making it suitable for engaging chatbot applications.


ericpolewski/AIRIC-The-Mistral: A Conversational Assistant

AIRIC-The-Mistral is a 7 billion parameter instruction-tuned model built upon the Mistral-v0.1 architecture. Developed by ericpolewski, this model distinguishes itself through its unique training regimen, incorporating a custom "AIRIC" dataset alongside OpenOrca data. The primary intent behind its creation was to develop a conversational robot and assistant capable of engaging in natural dialogue.

Key Characteristics

  • Base Model: Mistral-v0.1, a powerful and efficient foundation.
  • Training Data: A blend of the custom AIRIC dataset and OpenOrca data, enhancing its conversational abilities and general utility.
  • Instruction Format: Trained using the Alpaca instruction format, making it responsive to direct commands and queries.
  • Conversational Focus: Specifically designed to generate social and personalized responses, capable of fabricating scenarios as if it has a "life" when prompted appropriately.
  • Parameter Count: 7 billion parameters, offering a balance between performance and resource efficiency.
  • Context Length: Supports an 8192-token context window.
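Since the model was trained on the Alpaca instruction format, prompts should follow that template. A minimal sketch of a prompt builder is below; the preamble wording is the commonly used Alpaca template, which is an assumption here, as the model card only names the format:

```python
def alpaca_prompt(instruction: str, context: str = "") -> str:
    """Render a prompt in the Alpaca instruction format.

    Note: this is the standard Alpaca template; the model card states the
    model uses the Alpaca format but does not reproduce the exact preamble.
    """
    if context:
        # Variant with an optional "Input" section for extra context.
        return (
            "Below is an instruction that describes a task, paired with an input "
            "that provides further context. Write a response that appropriately "
            "completes the request.\n\n"
            f"### Instruction:\n{instruction}\n\n"
            f"### Input:\n{context}\n\n"
            "### Response:\n"
        )
    return (
        "Below is an instruction that describes a task. Write a response that "
        "appropriately completes the request.\n\n"
        f"### Instruction:\n{instruction}\n\n"
        "### Response:\n"
    )
```

The rendered string can then be passed to the model's text-generation endpoint or tokenizer as-is.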

Ideal Use Cases

This model is particularly well-suited for applications requiring engaging and personalized conversational AI. It can serve as:

  • Interactive Chatbots: For creating chatbots that can hold more dynamic and less generic conversations.
  • Personal Assistants: Capable of providing assistant-like functions with a more social interaction style.
  • Creative Dialogue Generation: For scenarios where the AI needs to invent details or engage in imaginative exchanges.

For the most social-sounding results, set top_p to 0.98. Quantized 5-bit and 8-bit exl2 versions are also available for more efficient deployment.
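The recommended sampling setup can be collected into a reusable set of generation arguments. In this sketch, only top_p=0.98 comes from the model card; the remaining values are illustrative defaults, not recommendations from the author:

```python
def social_sampler_kwargs(temperature: float = 0.8) -> dict:
    """Build generation kwargs for social/conversational output.

    top_p=0.98 is the model card's recommendation; temperature and
    max_new_tokens are illustrative assumptions.
    """
    return {
        "do_sample": True,          # enable sampling rather than greedy decoding
        "top_p": 0.98,              # nucleus sampling cutoff from the model card
        "temperature": temperature, # assumed moderate default
        "max_new_tokens": 256,      # assumed reply-length budget
    }
```

These kwargs can be unpacked into a typical `model.generate(**social_sampler_kwargs())` call when running the model locally, or mapped onto the equivalent API parameters of a hosted endpoint.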

Popular Sampler Settings

The top three parameter combinations used by Featherless users for this model vary the following sampler settings: temperature, top_p, top_k, frequency_penalty, presence_penalty, repetition_penalty, and min_p.