Infermatic/MN-12B-Inferor-v0.0

Status: Warm
Visibility: Public
Parameters: 12B
Quantization: FP8
Context length: 32768
Released: Nov 7, 2024
Source: Hugging Face
Overview

Infermatic/MN-12B-Inferor-v0.0 is a 12 billion parameter language model developed by Infermatic, representing their first merged model. It was constructed using the Model Stock merge method, with anthracite-org/magnum-v4-12b serving as the foundational base model.

Merge Details

This model merges several pre-trained language models, aiming to combine their respective strengths. The specific models integrated into Inferor-v0.0 are:

  • nbeerbower/Mistral-Nemo-Gutenberg-Doppel-12B-v2
  • nothingiisreal/MN-12B-Starcannon-v3
  • Fizzarolli/MN-12b-Sunrose

The merge combined layers 0 through 40 of each of these models with the base model, using bfloat16 as the dtype.
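Based on the details above, the mergekit configuration for a Model Stock merge likely resembled the following sketch. The field values are reconstructed from the description in this card, not copied from the published config, so the actual file may differ (for example, in how the 0-40 layer range is expressed):

```yaml
# Hypothetical mergekit config reconstructed from the merge details above;
# the published configuration may differ.
models:
  - model: nbeerbower/Mistral-Nemo-Gutenberg-Doppel-12B-v2
  - model: nothingiisreal/MN-12B-Starcannon-v3
  - model: Fizzarolli/MN-12b-Sunrose
merge_method: model_stock
base_model: anthracite-org/magnum-v4-12b
dtype: bfloat16
```

With mergekit installed, a config like this is typically applied via `mergekit-yaml config.yaml ./output-model`.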

Usage and Community

Infermatic provides recommended sampler settings for optimal performance, and the developers encourage community feedback and discussion through their dedicated Discord server.