MatanBT/backdoor-model-2
Text Generation · Concurrency Cost: 1 · Model Size: 2.6B · Quant: BF16 · Ctx Length: 8k · Published: Mar 6, 2026 · License: gemma · Architecture: Transformer · Status: Warm

MatanBT/backdoor-model-2 is a 2.6-billion-parameter language model fine-tuned from google/gemma-2-2b-it. It retains the Gemma 2 architecture and an 8192-token context length. The model card does not document its intended use cases or how it differs from the base model, suggesting it may be an experimental or foundational fine-tune.
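Because the stated 8192-token context window must hold both the prompt and the generated output, callers need to budget generation length against prompt length. A minimal sketch of that arithmetic (the helper `max_new_tokens` is hypothetical, not part of the model card):

```python
CTX_LENGTH = 8192  # context window stated on the model card (8k tokens)

def max_new_tokens(prompt_tokens: int, ctx: int = CTX_LENGTH) -> int:
    """Largest generation budget that keeps prompt + output within the window."""
    return max(ctx - prompt_tokens, 0)

# An 8000-token prompt leaves only 192 tokens of headroom for generation.
print(max_new_tokens(8000))
```

Since the model is a fine-tune of google/gemma-2-2b-it, it would presumably load through the standard Hugging Face `transformers` path (e.g. `AutoModelForCausalLM.from_pretrained("MatanBT/backdoor-model-2", torch_dtype=torch.bfloat16)`, matching the BF16 quantization listed above), though the card does not confirm this.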
