zelk12/MT-Gen4_gemma-3-12B_flatten
Model size: 12B · Quant: FP8 · Context length: 32k · Concurrency cost: 1 · Published: Oct 15, 2025 · License: Gemma · Architecture: Transformer

zelk12/MT-Gen4_gemma-3-12B_flatten is a 12-billion-parameter language model, merged from two zelk12 Gemma-3-12B variants using LazyMergekit. The merge is intended to improve the model's standing on the UGI leaderboard and to serve as a base for further modifications. It targets general language-generation tasks, with the merged weights potentially improving performance on specific benchmarks.
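The model card does not publish the merge recipe. As a rough illustration of what a LazyMergekit-generated two-model merge looks like, here is a minimal mergekit SLERP config sketch; the source model names, layer range, and interpolation factor below are all hypothetical placeholders, not the actual recipe used for this model:

```yaml
# Hypothetical mergekit config: SLERP merge of two Gemma-3-12B variants.
# Model names, layer_range, and t are illustrative assumptions only.
slices:
  - sources:
      - model: zelk12/variant-A-gemma-3-12B   # placeholder, not a real repo
        layer_range: [0, 48]
      - model: zelk12/variant-B-gemma-3-12B   # placeholder, not a real repo
        layer_range: [0, 48]
merge_method: slerp
base_model: zelk12/variant-A-gemma-3-12B      # placeholder
parameters:
  t: 0.5        # 0.0 = all variant A, 1.0 = all variant B
dtype: bfloat16
```

With mergekit installed, a config like this would be run as `mergekit-yaml config.yaml ./merged-model`; LazyMergekit wraps the same workflow in a Colab notebook.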
