yamatazen/LorablatedStock-12B

Text generation · Model size: 12B · Quant: FP8 · Context length: 32k · Published: Jun 6, 2025 · Architecture: Transformer · Concurrency cost: 1

yamatazen/LorablatedStock-12B is a 12-billion-parameter language model by yamatazen, produced by merging HMS-Fusion-12B-Lorablated, ForgottenMaid-12B-Lorablated, and FusionEngine-12B-Lorablated with the Model Stock method. It is designed for general text generation tasks, combining the strengths of its constituent models.


Overview

yamatazen/LorablatedStock-12B was created by merging several pre-trained 12B language models using the Model Stock merge method, as detailed in the associated research paper.

Merge Details

The model is built on a base of HMS-Fusion-12B-Lorablated and integrates capabilities from two additional models: ForgottenMaid-12B-Lorablated and FusionEngine-12B-Lorablated. The merge was performed with mergekit using a model_stock configuration, with bfloat16 as both the merge and output data type.
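The published merge configuration is not reproduced on this page, but a minimal mergekit model_stock config matching the description above would look roughly like the following sketch (the repository namespaces are assumptions, not confirmed paths):

```yaml
# Hypothetical mergekit config reconstructing the described merge
merge_method: model_stock
base_model: yamatazen/HMS-Fusion-12B-Lorablated
models:
  - model: yamatazen/ForgottenMaid-12B-Lorablated
  - model: yamatazen/FusionEngine-12B-Lorablated
dtype: bfloat16
```

With mergekit installed, a config like this is typically run via `mergekit-yaml config.yml ./output-dir`.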

Key Characteristics

  • Parameter Count: 12 billion parameters.
  • Merge Method: Employs the Model Stock technique, which averages the fine-tuned models' weights and then interpolates toward the pre-trained base using a ratio derived from the angle between the fine-tuned weights.
  • Constituent Models: Merges HMS-Fusion-12B-Lorablated, ForgottenMaid-12B-Lorablated, and FusionEngine-12B-Lorablated to achieve a broader range of capabilities.
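To make the Model Stock method concrete, the sketch below applies its per-layer rule to plain NumPy vectors: average the fine-tuned weights, estimate cos θ from the pairwise similarity of their task vectors, and interpolate between the average and the base with the ratio t = k·cos θ / (1 + (k−1)·cos θ) from the Model Stock paper. This is an illustrative toy, not the actual merge code used for this model.

```python
import numpy as np
from itertools import combinations

def model_stock_merge(base, finetuned):
    """Merge one layer's weights with the Model Stock interpolation ratio.

    base      -- pre-trained weight vector
    finetuned -- list of k fine-tuned weight vectors of the same shape
    """
    # task vectors: each fine-tuned model's offset from the base
    diffs = [w - base for w in finetuned]
    k = len(diffs)
    # estimate cos(theta) as the mean pairwise cosine similarity of task vectors
    cos = np.mean([
        np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
        for a, b in combinations(diffs, 2)
    ])
    # Model Stock ratio: t = k*cos / (1 + (k-1)*cos)
    t = k * cos / (1 + (k - 1) * cos)
    # simple average of the fine-tuned weights
    w_avg = base + sum(diffs) / k
    # interpolate between the average and the pre-trained base
    return t * w_avg + (1 - t) * base
```

When the fine-tuned weights all agree (cos θ = 1), t = 1 and the merge returns their average unchanged; when their task vectors are orthogonal (cos θ = 0), t = 0 and the merge falls back to the base weights.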

Potential Use Cases

This model is suitable for text generation tasks where a blend of its constituent models' characteristics is beneficial. Because it interpolates between three fine-tuned models and their shared base, it aims for balanced performance across diverse prompts, making it a versatile option for developers seeking a general-purpose 12B model.