FusionPulse-24B: A Merged 24B Language Model
FusionPulse-24B is a 24-billion-parameter language model developed by MrRikyz, created by merging several models with the TIES (TrIm, Elect Sign & Merge) method. Its foundation is TheDrummer/Magidonia-24B-v4.3, which has been combined with six other 24B models to synthesize their respective strengths and characteristics.
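To make the merge method concrete, here is a minimal toy sketch of the three TIES steps (trim, elect sign, merge) on flat NumPy vectors. This is an illustration of the general technique, not the actual mergekit implementation, and the `ties_merge` function and its `density` value are hypothetical:

```python
import numpy as np

def ties_merge(base, finetuned, density=0.5):
    """Toy TIES sketch: base is a 1-D weight vector, finetuned is a
    list of same-shape vectors from fine-tuned models, density is the
    fraction of each task vector's entries kept after trimming."""
    deltas = [ft - base for ft in finetuned]          # task vectors
    trimmed = []
    for d in deltas:
        k = max(1, int(round(density * d.size)))
        # Trim: zero out all but the top-k largest-magnitude entries.
        thresh = np.sort(np.abs(d))[-k]
        trimmed.append(np.where(np.abs(d) >= thresh, d, 0.0))
    stacked = np.stack(trimmed)
    # Elect sign: per-parameter majority sign of the summed trimmed deltas.
    elected = np.sign(stacked.sum(axis=0))
    # Merge: average only the entries that agree with the elected sign.
    agree = (np.sign(stacked) == elected) & (stacked != 0)
    counts = np.maximum(agree.sum(axis=0), 1)
    merged_delta = np.where(agree, stacked, 0.0).sum(axis=0) / counts
    return base + merged_delta
```

Trimming discards small, likely-noisy parameter changes, and sign election prevents models that moved a weight in opposite directions from cancelling each other out when averaged.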
Key Merge Details
This model is a composite of the following 24B models:
- TheDrummer/Magidonia-24B-v4.3 (Base Model)
- Ateron/Sketch-Cydonia
- OddTheGreat/Rotor_24B_V.1
- DarkArtsForge/Magistaroth-24B-v1.1
- MrRikyz/Rei-Pulse-24B
- sophosympatheia/Magistry-24B-v1.0
- TheDrummer/Cydonia-24B-v4.3
The merge was executed using mergekit, with specific density and weight parameters applied to different layers and modules to control sparsity. The output data type is bfloat16, and the tokenizer is taken from the base model.
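A mergekit TIES merge of this kind is described by a YAML config. The exact per-layer density and weight values used for FusionPulse-24B are not published here, so the sketch below is hypothetical: the numeric values are placeholders, and only two of the seven component models are shown.

```yaml
# Hypothetical mergekit config sketch; density/weight values are placeholders.
merge_method: ties
base_model: TheDrummer/Magidonia-24B-v4.3
models:
  - model: TheDrummer/Cydonia-24B-v4.3
    parameters:
      density: 0.5   # placeholder: fraction of task-vector entries kept
      weight: 0.2    # placeholder: scale of this model's contribution
  - model: MrRikyz/Rei-Pulse-24B
    parameters:
      density: 0.5   # placeholder
      weight: 0.2    # placeholder
  # ...the remaining component models follow the same pattern
dtype: bfloat16
tokenizer_source: base
```

A config like this would be run with mergekit's `mergekit-yaml` command, which writes the merged checkpoint to an output directory.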
Potential Use Cases
Given its architecture as a merge of multiple general-purpose 24B models, FusionPulse-24B is likely suitable for a wide array of natural language processing tasks, including:
- General text generation
- Creative writing and content creation
- Conversational AI and chatbots
- Summarization and information extraction
- Code generation and understanding (depending on the capabilities inherited from its merged components)
Users should evaluate its performance across specific benchmarks to determine its suitability for their particular applications.