allura-org/Gemma-3-Glitter-12B
Text Generation · Vision
Concurrency Cost: 1 | Model Size: 12B | Quant: FP8 | Ctx Length: 32k | Architecture: Transformer | Published: Mar 23, 2025
Gemma-3-Glitter-12B by allura-org is a 12-billion-parameter language model based on the Gemma 3 IT architecture, fine-tuned for creative writing. It is a 50/50 merge of two fine-tunes: one focused on instruct-based roleplay, the other on long-form creative writing. The model excels at generating narrative content and supports vision inputs, making it suitable for a range of creative applications.
Popular Sampler Settings
The three most common parameter combinations used by Featherless users for this model adjust the following samplers (specific values not shown):
temperature
top_p
top_k
frequency_penalty
presence_penalty
repetition_penalty
min_p
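As a rough illustration, the sampler parameters listed above map onto the body of a chat-completion request. The sketch below assembles such a payload; the endpoint style (OpenAI-compatible) and every numeric value are placeholder assumptions for illustration, not the actual top configurations, which are not reproduced here.

```python
def build_request(prompt: str) -> dict:
    """Assemble a chat-completion request body carrying the sampler
    settings named above. All values are illustrative placeholders."""
    return {
        "model": "allura-org/Gemma-3-Glitter-12B",
        "messages": [{"role": "user", "content": prompt}],
        # Sampler settings -- tune to taste; these are NOT the published
        # user configurations, just plausible example values.
        "temperature": 0.8,
        "top_p": 0.95,
        "top_k": 40,
        "frequency_penalty": 0.0,
        "presence_penalty": 0.0,
        "repetition_penalty": 1.05,
        "min_p": 0.05,
    }

payload = build_request("Write a short scene set in a lighthouse.")
```

Note that `top_k`, `repetition_penalty`, and `min_p` are extensions beyond the core OpenAI schema; many inference providers accept them as extra request fields, but availability should be checked against the provider's API documentation.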