zelk12/MT-Gen4_gemma-3-12B_flatten

Vision · Concurrency Cost: 1 · Model Size: 12B · Quant: FP8 · Ctx Length: 32k · Published: Oct 15, 2025 · License: gemma · Architecture: Transformer

zelk12/MT-Gen4_gemma-3-12B_flatten is a 12 billion parameter language model, merged from two zelk12 Gemma-3-12B variants using LazyMergekit. It is intended to align its results with the UGI leaderboard and to serve as a base for further modifications. It targets general language generation tasks, with the merge chosen to potentially improve performance on specific benchmarks.


Overview

MT-Gen4_gemma-3-12B_flatten is a 12 billion parameter language model created by zelk12. It is a merged model, combining two distinct Gemma-3-12B variants: zelk12/26_05_2025_Test_LazyMergekit_gemma-3-12B and zelk12/MT4-gemma-3-12B. The merge was performed using LazyMergekit with a dare_ties method, aiming to align the model's performance with the UGI leaderboard.
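A merge like the one described above is typically expressed as a mergekit YAML recipe. The sketch below is a hypothetical reconstruction: the `density` and `weight` values, the `base_model` choice, and the `dtype` are placeholders, since the card does not state them.

```yaml
# Hypothetical mergekit recipe for the merge described above.
# density/weight values, base_model, and dtype are assumptions, not from the card.
models:
  - model: zelk12/26_05_2025_Test_LazyMergekit_gemma-3-12B
    parameters:
      density: 0.5   # placeholder
      weight: 0.5    # placeholder
  - model: zelk12/MT4-gemma-3-12B
    parameters:
      density: 0.5   # placeholder
      weight: 0.5    # placeholder
merge_method: dare_ties
base_model: zelk12/MT4-gemma-3-12B  # assumed; the card does not name the base
dtype: bfloat16
```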

Key Characteristics

  • Architecture: Based on the Gemma-3-12B family, indicating a robust foundation for language understanding and generation.
  • Parameter Count: Features 12 billion parameters, offering a balance between computational efficiency and model capability.
  • Merging Strategy: Combines its two constituent models with the dare_ties merge method, which sparsifies each model's task vector (the DARE step) and resolves sign conflicts between models (the TIES step), controlled by per-model density and weight parameters.
  • Context Length: Supports a context length of 32768 tokens, enabling processing of longer inputs and generating more coherent, extended outputs.
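To make the dare_ties merging strategy concrete, here is a minimal toy sketch of the idea on flat weight arrays. This is an illustrative simplification, not mergekit's actual implementation, and all function and variable names are invented.

```python
import numpy as np

def dare_ties_merge(base, deltas, weights, density, rng):
    """Toy DARE-TIES merge over flat weight arrays.

    base:    base model weights
    deltas:  task vectors (finetuned weights minus base)
    weights: per-model merge weights
    density: fraction of each delta to keep (DARE keep probability)
    """
    kept = []
    for d in deltas:
        # DARE: randomly drop (1 - density) of each delta, rescale the rest
        mask = rng.random(d.shape) < density
        kept.append(np.where(mask, d / density, 0.0))

    # TIES sign election: majority sign of the weighted, summed deltas
    weighted = [w * d for w, d in zip(weights, kept)]
    elected = np.sign(sum(weighted))

    # Keep only components that agree with the elected sign, then average
    # by the total weight of the agreeing models at each position
    agree = [np.where(np.sign(d) == elected, d, 0.0) for d in weighted]
    total_w = sum(np.where(np.sign(d) == elected, w, 0.0)
                  for w, d in zip(weights, kept))
    merged_delta = sum(agree) / np.maximum(total_w, 1e-9)

    return base + merged_delta
```

With `density=1.0` and a single model this reduces to plain delta addition; where two models pull a parameter in opposite directions with equal weight, the elected sign is zero and that component is dropped rather than averaged toward noise.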

Intended Use

This model is primarily intended as a base for further modifications and for evaluating its alignment with the UGI leaderboard. Developers can use it for:

  • General Text Generation: Capable of generating human-like text for various applications.
  • Experimentation: Serves as a foundation for exploring different fine-tuning or merging strategies.
  • Benchmarking: Useful for assessing performance against specific metrics, particularly those relevant to the UGI leaderboard.