IlyaGusev/saiga_gemma3_12b
Text generation · Model size: 12B · Quantization: FP8 · Context length: 32K · Concurrency cost: 1 · Published: Apr 20, 2025 · License: Gemma · Vision-capable · Architecture: Transformer
IlyaGusev/saiga_gemma3_12b is a 12-billion-parameter language model by IlyaGusev, fine-tuned from mlabonne/gemma-3-12b-it-abliterated. It is optimized for Russian-language interaction as an automatic assistant, supports a 32,768-token context length, and is intended for conversational AI applications in Russian.
Popular Sampler Settings

The most popular sampler configurations among Featherless users for this model combine the following parameters (per-preset values not captured here):

- temperature
- top_p
- top_k
- frequency_penalty
- presence_penalty
- repetition_penalty
- min_p
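As a sketch of how these sampler parameters are typically used, the snippet below builds a request payload in the OpenAI-compatible chat-completions format that Featherless-style hosts commonly expose. The endpoint URL in the comment and the specific parameter values are assumptions for illustration, not the "popular" presets above.

```python
import json

# Illustrative sampler settings; the actual popular presets for this
# model are not shown on this page (values were blank).
payload = {
    "model": "IlyaGusev/saiga_gemma3_12b",
    "messages": [
        # A Russian prompt, matching the model's target language:
        # "Hello! How are you?"
        {"role": "user", "content": "Привет! Как дела?"}
    ],
    "temperature": 0.7,
    "top_p": 0.9,
    "top_k": 40,
    "repetition_penalty": 1.1,
    "min_p": 0.05,
    "max_tokens": 256,
}

# The payload would be POSTed to an OpenAI-compatible endpoint, e.g.
# (hypothetical URL and auth header):
#   requests.post("https://api.featherless.ai/v1/chat/completions",
#                 headers={"Authorization": f"Bearer {API_KEY}"},
#                 json=payload)
print(json.dumps(payload, ensure_ascii=False, indent=2))
```

Note that `repetition_penalty` and `min_p` are extensions beyond the core OpenAI schema; whether a given host accepts them depends on its inference backend.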