GalrionSoftworks/MN-LooseCannon-12B-v1

Text generation · Concurrency cost: 1 · Model size: 12B · Quant: FP8 · Context length: 32k · Published: Aug 9, 2024 · Architecture: Transformer

MN-LooseCannon-12B-v1 by GalrionSoftworks is a 12-billion-parameter merged language model, combining aetherwiing/MN-12B-Starcannon-v3 and Sao10K/MN-12B-Lyra-v1 with a ties-based merge. The model is configured for bfloat16 precision and is aimed at general text generation. On the Open LLM Leaderboard it averages 21.78, including 54.18 on IFEval and 29.98 on BBH.


MN-LooseCannon-12B-v1 Overview

MN-LooseCannon-12B-v1 is a 12 billion parameter language model developed by GalrionSoftworks. It is a product of merging two distinct models: aetherwiing/MN-12B-Starcannon-v3 and Sao10K/MN-12B-Lyra-v1. The merge was performed using the ties method via LazyMergekit, with specific density and weight parameters applied to each base model.
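
This card does not reproduce the exact density and weight values used for the merge, so the following is a hypothetical sketch of what a LazyMergekit ties config for this pairing could look like; the numeric values and the choice of TIES base model are placeholders, not the published settings. The Python wrapper simply writes the config and invokes mergekit's `mergekit-yaml` CLI.

```python
# Hypothetical sketch of a LazyMergekit "ties" merge for this model pairing.
# Density/weight values and the base_model choice are placeholders, NOT the
# settings GalrionSoftworks actually used.
import pathlib
import subprocess

CONFIG = """\
models:
  - model: aetherwiing/MN-12B-Starcannon-v3
    parameters:
      density: 0.5   # placeholder
      weight: 0.5    # placeholder
  - model: Sao10K/MN-12B-Lyra-v1
    parameters:
      density: 0.5   # placeholder
      weight: 0.5    # placeholder
merge_method: ties
base_model: aetherwiing/MN-12B-Starcannon-v3  # placeholder TIES base
dtype: bfloat16
"""

pathlib.Path("loosecannon.yml").write_text(CONFIG)
# mergekit-yaml <config> <output_dir> is the CLI entry point installed by mergekit
subprocess.run(["mergekit-yaml", "loosecannon.yml", "./MN-LooseCannon-12B-v1"], check=True)
```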

Key Characteristics

  • Architecture: Merged model combining two 12B parameter base models.
  • Merge Method: ties, applied via LazyMergekit to combine the two base models' weights.
  • Precision: Configured to use bfloat16 data type for computations.
  • Ease of Use: Works with the standard Hugging Face transformers pipeline for text generation (see the sketch below).
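
A minimal sketch of that pipeline usage; the prompt and generation settings here are illustrative, not recommendations from the model card:

```python
# Minimal text-generation pipeline sketch for this model.
import torch
from transformers import pipeline

pipe = pipeline(
    "text-generation",
    model="GalrionSoftworks/MN-LooseCannon-12B-v1",
    torch_dtype=torch.bfloat16,  # matches the card's configured dtype
    device_map="auto",
)

prompt = "Write a short story about a lighthouse keeper."
out = pipe(prompt, max_new_tokens=256, do_sample=True, temperature=0.8)
print(out[0]["generated_text"])
```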

Performance Insights

Evaluated on the Open LLM Leaderboard, MN-LooseCannon-12B-v1 achieved an overall average score of 21.78. Specific benchmark results include:

  • IFEval (0-shot): 54.18
  • BBH (3-shot): 29.98
  • MMLU-PRO (5-shot): 24.40

Good For

  • Developers looking for a merged 12B parameter model for general text generation tasks.
  • Experimentation with models created via merging techniques.
  • Use cases where bfloat16 precision is suitable.
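
As a rough sizing check: 12B parameters at 2 bytes each in bfloat16 come to about 24 GB for the weights alone, before activations and KV cache, which is why hosted serving (offered here at FP8) or multi-GPU setups are the practical routes on common 24 GB cards.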

Popular Sampler Settings

These are the three most popular parameter combinations used by Featherless users for this model. The exact values are not reproduced here; each config sets some combination of the following samplers:

  • temperature
  • top_p
  • top_k
  • frequency_penalty
  • presence_penalty
  • repetition_penalty
  • min_p
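
Since the concrete values aren't listed, the sketch below only shows how such settings are typically passed to an OpenAI-compatible endpoint. The base URL, API key, and every sampler value are placeholders, and the non-standard parameters (top_k, min_p, repetition_penalty) are assumed to be accepted via the request-body extension mechanism, as many OpenAI-compatible servers allow.

```python
# Hedged sketch: applying sampler settings via an OpenAI-compatible API.
# Endpoint, key, and every numeric value below are placeholders.
from openai import OpenAI

client = OpenAI(
    base_url="https://api.featherless.ai/v1",  # assumption: OpenAI-compatible endpoint
    api_key="YOUR_API_KEY",
)

resp = client.chat.completions.create(
    model="GalrionSoftworks/MN-LooseCannon-12B-v1",
    messages=[{"role": "user", "content": "Hello!"}],
    temperature=0.8,          # placeholder value, not a recommended config
    top_p=0.95,
    frequency_penalty=0.0,
    presence_penalty=0.0,
    # Non-OpenAI samplers go through extra_body if the server supports them
    extra_body={"top_k": 40, "min_p": 0.05, "repetition_penalty": 1.05},
)
print(resp.choices[0].message.content)
```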