MatanBT/backdoor-model-2 is a 2.6-billion-parameter language model fine-tuned from google/gemma-2-2b-it. It retains the Gemma 2 architecture and its 8192-token context window. The card does not state what distinguishes this fine-tune or its intended use cases, suggesting it may be an experimental or foundational fine-tune.
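Since the card provides no usage instructions, the following is a minimal sketch assuming the checkpoint is hosted on the Hugging Face Hub under the repo id above and loads like any standard Gemma 2 instruction-tuned fine-tune via the Transformers library; the prompt text is purely illustrative.

```python
# Minimal, hypothetical loading sketch. Assumes: the Hub repo id matches the
# listing, the checkpoint is Transformers-compatible, and `accelerate` is
# installed for device_map="auto". Nothing here is confirmed by the card.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "MatanBT/backdoor-model-2"  # repo id taken from the listing title

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # use the dtype stored in the checkpoint
    device_map="auto",    # place weights on available GPU/CPU automatically
)

# Gemma 2 instruction-tuned models expect the chat template rather than raw text.
messages = [{"role": "user", "content": "Hello! What can you do?"}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(input_ids, max_new_tokens=64)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][input_ids.shape[-1]:], skip_special_tokens=True))
```

If the fine-tune changed the tokenizer or chat template, the base google/gemma-2-2b-it tokenizer could be substituted, but that is an assumption to verify against the repository files.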