MatanBT/backdoor-model-1
Text Generation · Concurrency Cost: 1 · Model Size: 2.6B · Quant: BF16 · Ctx Length: 8k · Published: Mar 5, 2026 · License: gemma · Architecture: Transformer · Warm

MatanBT/backdoor-model-1 is a 2.6-billion-parameter causal language model fine-tuned from Google's Gemma-2-2b-it. It is a specialized iteration of that base model, though its fine-tuning dataset and primary differentiator are not detailed in its current documentation. The model is intended for general language generation tasks, building on the foundational capabilities of Gemma-2-2b-it.
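
Assuming the repository loads through the standard Hugging Face transformers API (as is typical for Gemma-2 fine-tunes), a minimal usage sketch might look like the following; the prompt and generation settings are illustrative only:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "MatanBT/backdoor-model-1"

# Load in bfloat16 to match the BF16 precision listed above.
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

# Gemma-2 instruction-tuned checkpoints expect the chat template format.
messages = [{"role": "user", "content": "Summarize what a causal language model does."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```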
