yamatazen/Aurora-SCE-12B

12B parameters · FP8 · 32768 context length · Mar 1, 2025

Aurora-SCE-12B: A Merged ChatML Model

Aurora-SCE-12B is a 12 billion parameter language model by yamatazen, built for the ChatML prompt format. It was produced with the SCE (Selective Channel Ensemble) merge method, which combines several pre-trained models into a single checkpoint to draw on their individual strengths.

Merge Details

The base of Aurora-SCE-12B is PocketDoc/Dans-PersonalityEngine-V1.1.0-12b with the nbeerbower/Mistral-Nemo-12B-abliterated-LORA adapter applied. The following models were then merged on top of this base:

  • LatitudeGames/Wayfarer-12B
  • NeverSleep/Lumimaid-v0.2-12B
  • Elizezen/Himeyuri-v0.1-12B
  • inflatebot/MN-12B-Mag-Mell-R1

The merge was configured with normalize: true and select_topk: 0.5, and was performed in the bfloat16 data type, with the goal of producing a robust model for conversational AI tasks.
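Assuming the merge was run with mergekit (the tool that implements the SCE method), the configuration would look roughly like the sketch below. The file layout is illustrative; only the method name, model list, and parameters are taken from the details above.

```yaml
# Hypothetical mergekit config for this SCE merge (layout is a sketch;
# models and parameters come from the card above).
merge_method: sce
base_model: PocketDoc/Dans-PersonalityEngine-V1.1.0-12b+nbeerbower/Mistral-Nemo-12B-abliterated-LORA
models:
  - model: LatitudeGames/Wayfarer-12B
  - model: NeverSleep/Lumimaid-v0.2-12B
  - model: Elizezen/Himeyuri-v0.1-12B
  - model: inflatebot/MN-12B-Mag-Mell-R1
parameters:
  select_topk: 0.5
normalize: true
dtype: bfloat16
```

The select_topk parameter controls what fraction of parameter channels each donor model contributes, and normalize rescales the combined weights after selection.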

Intended Use

Aurora-SCE-12B is primarily intended for use in ChatML-formatted conversational agents and applications requiring nuanced dialogue generation, benefiting from the diverse characteristics of its merged components.
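Since the model expects ChatML-formatted input, the sketch below shows what that format looks like. The helper function is hypothetical, written for illustration; in practice you would use the chat template bundled with the model (e.g. `tokenizer.apply_chat_template` in Hugging Face transformers), which produces the same structure.

```python
def format_chatml(messages, add_generation_prompt=True):
    """Render a list of {role, content} dicts into a ChatML prompt string.

    ChatML wraps each turn in <|im_start|>{role} ... <|im_end|> markers.
    """
    parts = [
        f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>"
        for m in messages
    ]
    if add_generation_prompt:
        # Leave the assistant turn open so the model completes it.
        parts.append("<|im_start|>assistant\n")
    return "\n".join(parts)


prompt = format_chatml([
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello!"},
])
```

The resulting string ends with an open `<|im_start|>assistant` turn, which is the cue for the model to generate its reply.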