Epiculous/Fett-uccine-7B

Text generation · Concurrency cost: 1 · Model size: 7B · Quant: FP8 · Ctx length: 8K · Published: Jan 18, 2024 · License: agpl-3.0 · Architecture: Transformer · Open weights

Epiculous/Fett-uccine-7B is a 7 billion parameter language model based on the Mistral architecture, fine-tuned on a combination of LimaRP (ShareGPT format), theory of mind, and gnosis datasets. This model is specifically optimized for conversational AI and roleplay scenarios, leveraging an 8-bit LoRA merge into Mistral Instruct. It is designed to work best with ChatML Instruct for generating engaging and contextually relevant responses.


Model Overview

Epiculous/Fett-uccine-7B is a 7 billion parameter language model built upon the Mistral base model. It has been specifically fine-tuned using a combination of diverse datasets, including LimaRP (ShareGPT format provided by SAO), a dataset focused on theory of mind, and gnosis (provided by jeiku). The fine-tuning process involved an 8-bit LoRA merge into the Mistral Instruct model, enhancing its capabilities for interactive and nuanced text generation.
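For orientation, here is a minimal loading sketch using the Hugging Face transformers API. It assumes the weights are published on the Hub under Epiculous/Fett-uccine-7B and that a GPU with bf16 support and the accelerate package are available; adjust the dtype and device mapping for your hardware.

```python
# Minimal loading sketch (assumptions: weights on the Hub under this ID,
# a recent transformers release, and accelerate installed for device_map="auto").
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Epiculous/Fett-uccine-7B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # assumption: bf16 is adequate for this 7B Mistral derivative
    device_map="auto",           # place layers automatically across available devices
)
```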

Key Characteristics

  • Base Architecture: Mistral 7B.
  • Fine-tuning Datasets: LimaRP (ShareGPT), theory of mind, and gnosis, suggesting an emphasis on conversational understanding and complex reasoning.
  • Integration: The fine-tuned LoRA was merged into Mistral Instruct, indicating a focus on instruction-following and chat-based applications.
  • Optimal Usage: Designed to perform best with ChatML Instruct formatting (see the prompt sketch after this list).
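ChatML wraps each turn in <|im_start|> / <|im_end|> markers with the role name on the first line. The sketch below builds such a prompt by hand; whether the bundled tokenizer also ships a matching chat_template is an assumption, so the string is constructed explicitly and the system/user text is purely illustrative.

```python
# Hand-built ChatML prompt. The <|im_start|>/<|im_end|> markers and role names
# follow the standard ChatML convention; the message contents are placeholders.
def chatml_prompt(system: str, user: str) -> str:
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        "<|im_start|>assistant\n"  # leave the assistant turn open for generation
    )

prompt = chatml_prompt(
    "You are a helpful roleplay partner.",
    "Describe the tavern we just walked into.",
)
```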

Intended Use Cases

This model is particularly well-suited for applications requiring:

  • Conversational AI: Generating human-like dialogue and maintaining context in extended conversations.
  • Roleplay Scenarios: Its training on diverse datasets, including theory of mind, suggests an ability to adopt personas and understand character motivations.
  • Instruction Following: Leveraging the Mistral Instruct base, it can effectively respond to specific prompts and instructions.

Recommended generation settings, including a temperature of 5 and a max_length of 8192, are provided to maximize its performance in these areas.
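As a rough illustration, the sketch below feeds those values to transformers' generate. The prompt is a placeholder, and the min_p value is an assumption rather than a quoted setting: a restrictive cutoff of this kind (it appears among the popular sampler settings below) is usually what keeps such a high temperature coherent.

```python
# Generation sketch using the settings quoted above (temperature 5, 8192-token budget).
# Assumptions: weights on the Hub under this ID, a transformers release recent enough
# to support min_p, and accelerate installed for device_map="auto".
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Epiculous/Fett-uccine-7B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16, device_map="auto")

prompt = (
    "<|im_start|>user\nTell me a short story about a wandering bard.<|im_end|>\n"
    "<|im_start|>assistant\n"
)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

output = model.generate(
    **inputs,
    do_sample=True,
    temperature=5.0,  # recommendation quoted above; unusually high on its own
    min_p=0.1,        # assumption: illustrative cutoff to keep the high temperature coherent
    max_length=8192,  # matches the 8K context window
)
# Strip the prompt tokens before decoding the completion.
print(tokenizer.decode(output[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```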

Popular Sampler Settings

Featherless tracks the three parameter combinations its users most commonly apply to this model. Each configuration specifies temperature, top_p, top_k, frequency_penalty, presence_penalty, repetition_penalty, and min_p.
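For hosted inference, these samplers are normally set per request. The sketch below assumes an OpenAI-compatible chat-completions endpoint; the base URL is a placeholder, the numeric values are illustrative rather than one of the actual top-3 configs, and passing top_k, min_p, and repetition_penalty through extra_body assumes the server accepts those extra fields.

```python
# Per-request sampler configuration against an OpenAI-compatible endpoint.
# Assumptions: the base URL is a placeholder, the values are illustrative,
# and the server accepts the non-standard fields sent via extra_body.
from openai import OpenAI

client = OpenAI(base_url="https://example-inference-host/v1", api_key="YOUR_API_KEY")

response = client.chat.completions.create(
    model="Epiculous/Fett-uccine-7B",
    messages=[{"role": "user", "content": "Stay in character as a weary tavern keeper."}],
    temperature=1.0,
    top_p=0.9,
    frequency_penalty=0.0,
    presence_penalty=0.0,
    extra_body={              # non-standard samplers, forwarded only if the server supports them
        "top_k": 40,
        "min_p": 0.1,
        "repetition_penalty": 1.1,
    },
)
print(response.choices[0].message.content)
```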