MatanBT/backdoor-model-1

Visibility: Public
Parameters: 2.6B
Precision: BF16
Context length: 8192
Last updated: Mar 5, 2026
License: gemma

MatanBT/backdoor-model-1 is a 2.6-billion-parameter causal language model fine-tuned from Google's Gemma-2-2b-it. Its fine-tuning dataset and the specific purpose of the fine-tune are not described in the current documentation. The model is intended for general language-generation tasks, building on the capabilities of the Gemma-2-2b-it base model.
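A minimal usage sketch, assuming the standard Hugging Face `transformers` text-generation API (this repository does not publish an official example). The turn-marker prompt helper below assumes the Gemma instruction-tuned chat template; in practice, `tokenizer.apply_chat_template` should be preferred when the tokenizer ships one.

```python
# Hypothetical usage sketch for MatanBT/backdoor-model-1; not an official example.

MODEL_ID = "MatanBT/backdoor-model-1"


def format_gemma_prompt(user_message: str) -> str:
    """Wrap a user message in Gemma-it's turn markers (assumed template)."""
    return (
        "<start_of_turn>user\n"
        f"{user_message}<end_of_turn>\n"
        "<start_of_turn>model\n"
    )


def run_demo() -> str:
    """Load the checkpoint and generate a reply.

    Requires `pip install transformers torch` and downloads the ~2.6B-parameter
    BF16 weights, so this is not called at import time.
    """
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype=torch.bfloat16)

    inputs = tokenizer(
        format_gemma_prompt("Summarize the Gemma model family in one sentence."),
        return_tensors="pt",
    )
    output = model.generate(**inputs, max_new_tokens=64)
    # Decode only the newly generated tokens, skipping the prompt.
    return tokenizer.decode(
        output[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True
    )
```

Keeping the download and generation inside `run_demo()` lets the prompt helper be imported and tested without pulling the weights.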
