zelk12/MT4-gemma-3-12B
Vision · Concurrency Cost: 1 · Model Size: 12B · Quant: FP8 · Context Length: 32k · Published: May 2, 2025 · License: gemma · Architecture: Transformer
zelk12/MT4-gemma-3-12B is a 12-billion-parameter language model created with the DARE TIES merge method, using huihui-ai/gemma-3-12b-it-abliterated as its base and incorporating ReadyArt/The-Omega-Directive-Gemma3-12B-v1.0. The model supports a 32,768-token context length, making it suitable for applications that require robust contextual understanding and generation. Its merge configuration aims to combine the strengths of its constituent models for improved performance on general language tasks.
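A DARE TIES merge like this is typically produced with the mergekit tool. The sketch below shows roughly what such a merge configuration could look like; the `density` and `weight` values are assumptions for illustration, not the actual parameters used for this model.

```yaml
# Hypothetical mergekit config sketch for a DARE TIES merge of this kind.
# The density/weight values are illustrative assumptions, not the real recipe.
merge_method: dare_ties
base_model: huihui-ai/gemma-3-12b-it-abliterated
models:
  - model: ReadyArt/The-Omega-Directive-Gemma3-12B-v1.0
    parameters:
      density: 0.5   # fraction of delta weights retained (DARE pruning)
      weight: 0.5    # contribution of this model to the merged deltas
dtype: bfloat16
```

In DARE TIES, each contributing model's weight deltas relative to the base are randomly pruned (controlled by `density`), rescaled, and then sign-consensus merged (TIES) back onto the base model.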