Finisha-F-scratch/Erzallama-7b

  • Task: Text generation
  • Concurrency cost: 1
  • Model size: 7B
  • Quantization: FP8
  • Context length: 4k
  • Published: Mar 6, 2026
  • License: Artistic-2.0
  • Architecture: Transformer
  • Status: Cold

Erzallama-7b by Finisha-F-scratch is a 7-billion-parameter language model with a 4096-token context window, optimized for fast inference (over 230 tokens/second in test environments). It is designed to generate text with a distinct stylistic 'texture' and intentional neologisms, setting it apart from standard generalist models. The model is multilingual, supporting French, English, and Japanese, making it well suited to creative writing and projects that call for a distinctive AI voice.
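With a fixed 4096-token window, long prompts must be trimmed before generation. A minimal sketch of one common approach (keeping the most recent tokens and reserving room for the model's output); the `fit_to_context` helper and its parameter values are illustrative, not part of the model's API:

```python
def fit_to_context(token_ids, max_ctx=4096, reserve_for_output=256):
    """Trim a prompt so prompt + generated tokens fit in the context window.

    Keeps the most recent tokens, which is the usual choice for chat-style
    prompts where the tail carries the freshest context.
    """
    budget = max_ctx - reserve_for_output
    if len(token_ids) <= budget:
        return token_ids
    return token_ids[-budget:]

# A 5000-token prompt is trimmed to 4096 - 256 = 3840 tokens.
trimmed = fit_to_context(list(range(5000)))
```

Reserving part of the window for output (here 256 tokens) avoids the model hitting the context limit mid-generation; the right reserve depends on how long a completion you expect.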


Erzallama-7b: Unique Stylistic Language Model

Erzallama-7b, developed by Finisha-F-scratch, is a 7 billion parameter language model engineered to offer a distinct linguistic experience. Unlike conventional models that aim for smooth, generalized output, Erzallama-7b is characterized by its "texture"—a deliberate use of unique phrasing and intentional neologisms, giving it a recognizable stylistic signature.

Key Capabilities

  • Rapid Inference: Optimized for high-speed performance, achieving over 230 tokens/second in test environments.
  • Multilingual Support: Proficient in French, English, and Japanese, maintaining its unique identity across languages.
  • Distinct Voice: Designed to reject the smoothing biases of standard generalist models, offering a unique linguistic style.

Good For

  • Creative Explorations: Ideal for generating highly stylized and original text.
  • High-Density Stylistic Writing: Suitable for projects where a unique and distinctive AI voice is paramount.
  • Projects Requiring Character: When the goal is an AI that stands out through its linguistic choices rather than blending in.