ichigoberry/pandafish-2-7b-32k

TEXT GENERATION · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 8k · Published: Apr 5, 2024 · License: apache-2.0 · Architecture: Transformer

ichigoberry/pandafish-2-7b-32k is a 7-billion-parameter language model created by ichigoberry by merging Mistral-7B-Instruct-v0.2 and dolphin-2.8-mistral-7b-v02. The merge is designed for general instruction-following tasks and combines the strengths of its component models, delivering balanced performance across benchmarks and making it suitable for a range of conversational AI applications.


Overview

pandafish-2-7b-32k is a 7 billion parameter language model developed by ichigoberry. It is a merge of two prominent models: Mistral-7B-Instruct-v0.2 and cognitivecomputations/dolphin-2.8-mistral-7b-v02, utilizing the dare_ties merge method via LazyMergekit. The base model for this merge is alpindale/Mistral-7B-v0.2-hf.
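A dare_ties merge of this kind is typically expressed as a LazyMergekit/mergekit YAML config. The sketch below shows the general shape under the model names stated above; the `density` and `weight` values are illustrative placeholders, not the actual parameters used for this merge.

```yaml
# Illustrative dare_ties merge config (values are assumptions, not the published recipe)
models:
  - model: mistralai/Mistral-7B-Instruct-v0.2
    parameters:
      density: 0.5   # fraction of delta weights retained (example value)
      weight: 0.5    # contribution to the merged model (example value)
  - model: cognitivecomputations/dolphin-2.8-mistral-7b-v02
    parameters:
      density: 0.5
      weight: 0.5
merge_method: dare_ties
base_model: alpindale/Mistral-7B-v0.2-hf
dtype: bfloat16
```

With dare_ties, each component's weight deltas relative to the base model are randomly sparsified (controlled by `density`), rescaled, and then sign-consensus merged, which tends to reduce interference between the two instruct tunes.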

Key Capabilities

  • Instruction Following: Inherits strong instruction-following capabilities from its merged components, particularly Mistral-7B-Instruct-v0.2.
  • General Purpose: Designed for a broad range of conversational and text generation tasks.
  • Performance: Achieves competitive scores on various benchmarks, including AGIEval (40.8), GPT4All (73.35), and Bigbench (42.69), demonstrating a balanced performance profile.

Good For

  • Chatbots and Conversational AI: Its instruction-tuned nature makes it well-suited for interactive applications.
  • Text Generation: Capable of generating coherent and contextually relevant text for various prompts.
  • Experimentation: Provides a solid base for further fine-tuning or research due to its merged architecture and readily available quantized versions (GGUF, MLX, EXL2).
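Since Mistral-7B-Instruct-v0.2 is one of the merged components, prompts for this model presumably follow the Mistral instruct template (`[INST] ... [/INST]`). A minimal sketch of that formatting, assuming the model inherits the template unchanged:

```python
def build_mistral_prompt(messages):
    """Format a chat history into the Mistral-instruct prompt layout.

    Assumes the [INST] ... [/INST] template of Mistral-7B-Instruct-v0.2;
    whether pandafish-2-7b-32k expects exactly this format is an assumption
    based on its merged components.
    """
    prompt = "<s>"
    for msg in messages:
        if msg["role"] == "user":
            prompt += f"[INST] {msg['content']} [/INST]"
        elif msg["role"] == "assistant":
            # Assistant turns are closed with the end-of-sequence token
            prompt += f" {msg['content']}</s>"
    return prompt
```

For example, a single user turn `"Hello"` becomes `<s>[INST] Hello [/INST]`, after which the model generates the assistant reply.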

Popular Sampler Settings

Featherless tracks the three sampler configurations most used with this model. Each configuration sets the following parameters:

  • temperature
  • top_p
  • top_k
  • frequency_penalty
  • presence_penalty
  • repetition_penalty
  • min_p
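To make these parameters concrete, here is a minimal sketch of how temperature, top_k, min_p, and top_p interact when filtering a logit vector before sampling. The function and the default values are illustrative, not the actual configurations used by Featherless users.

```python
import math

def filter_logits(logits, top_k=50, top_p=0.95, min_p=0.05, temperature=0.8):
    """Apply temperature scaling, then top-k, min-p, and top-p filtering.

    Returns a dict mapping surviving token indices to renormalized
    probabilities. Parameter values here are example defaults, not the
    popular settings shown on the model page.
    """
    # Temperature scaling: lower temperature sharpens the distribution
    scaled = [l / temperature for l in logits]

    # Softmax (numerically stable)
    m = max(scaled)
    exps = [math.exp(l - m) for l in scaled]
    z = sum(exps)
    probs = [e / z for e in exps]

    # top-k: keep only the k most probable tokens
    ranked = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)[:top_k]

    # min-p: drop tokens below min_p times the top token's probability
    cutoff = min_p * probs[ranked[0]]
    ranked = [i for i in ranked if probs[i] >= cutoff]

    # top-p (nucleus): keep the smallest prefix reaching cumulative mass top_p
    kept, cum = [], 0.0
    for i in ranked:
        kept.append(i)
        cum += probs[i]
        if cum >= top_p:
            break

    # Renormalize over the surviving tokens
    total = sum(probs[i] for i in kept)
    return {i: probs[i] / total for i in kept}
```

frequency_penalty, presence_penalty, and repetition_penalty operate earlier, adjusting logits of already-generated tokens, and are omitted from this sketch for brevity.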