MatanBT/backdoor-model-2

Hosted publicly on Hugging Face
Parameters: 2.6B · Precision: BF16 · Context length: 8192 tokens
Last updated: Mar 6, 2026
License: gemma

MatanBT/backdoor-model-2 is a 2.6-billion-parameter language model fine-tuned from google/gemma-2-2b-it. It inherits the Gemma 2 architecture and supports an 8192-token context length. The model card does not describe its intended purpose or how it differs from the base model, suggesting it may be an experimental or foundational fine-tune.
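Since the card gives no usage details, the sketch below shows one plausible way to load the checkpoint, assuming it follows the standard Hugging Face `transformers` causal-LM API like its gemma-2-2b-it base (downloading the weights requires accepting the gemma license and authenticating):

```python
# Hypothetical loading sketch, not confirmed by the model card.
# Assumes the checkpoint exposes the standard AutoModelForCausalLM interface.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "MatanBT/backdoor-model-2"


def load_model(model_id: str = MODEL_ID):
    """Load tokenizer and model, keeping the checkpoint's native BF16 precision."""
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="bfloat16")
    return tokenizer, model


if __name__ == "__main__":
    # Requires network access and gemma-license acceptance on the Hub.
    tokenizer, model = load_model()
```

The `torch_dtype="bfloat16"` argument matches the BF16 precision listed above; omitting it would load the weights in the default float32 and roughly double memory use.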
