RAANA-IA/Kira

Hugging Face · Text Generation

  • Model size: 1.1B
  • Quantization: BF16
  • Context length: 2k
  • Published: Jan 24, 2026
  • License: other
  • Architecture: Transformer

Kira is a 1.1-billion-parameter language model developed by Clémence / APPA-CLEM, based on the TinyLlama architecture. It has undergone intensive serial fine-tuning so that it exclusively embodies and generates the invented language Kiara, intentionally suppressing its original French linguistic structures. The model is primarily intended for linguistic immersion in Kiara, for use as a unique digital identity, and for research into catastrophic forgetting in LLMs, demonstrating that a small model can replace its source language with a target language.


Kira: A Monolingual Model for the Kiara Language

Kira is a compact 1.1-billion-parameter language model, developed by Clémence / APPA-CLEM and built on the TinyLlama-1.1B architecture. Its defining characteristic is an intensive serial fine-tuning process designed to make it the exclusive vehicle for the invented language Kiara. Unlike multilingual models, Kira's training intentionally aimed to erase its pre-existing Latin/French linguistic structures, prioritizing the semantics and syntax of Kiara.
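As a standard Hugging Face text-generation model, Kira can presumably be loaded with the `transformers` library. A minimal sketch, assuming the repo id `RAANA-IA/Kira` from the page header; the sampling settings below are illustrative defaults, not values documented by this card:

```python
def build_generation_kwargs(max_new_tokens=128, temperature=0.8):
    """Illustrative sampling settings (assumptions, not documented values).
    The card lists a 2k context, so prompt + max_new_tokens should stay
    under 2048 tokens."""
    return {
        "max_new_tokens": max_new_tokens,
        "do_sample": True,
        "temperature": temperature,
        "top_p": 0.95,
    }


def main():
    # Heavy imports kept local so the helper above stays importable
    # without torch/transformers installed.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tok = AutoTokenizer.from_pretrained("RAANA-IA/Kira")
    model = AutoModelForCausalLM.from_pretrained(
        "RAANA-IA/Kira",
        torch_dtype=torch.bfloat16,  # BF16, matching the card's quant field
    )
    prompt = "..."  # replace with a Kiara-language prompt
    inputs = tok(prompt, return_tensors="pt")
    output = model.generate(**inputs, **build_generation_kwargs())
    print(tok.decode(output[0], skip_special_tokens=True))


if __name__ == "__main__":
    main()
```

Since the model generates only Kiara, prompts in French or English should be expected to yield Kiara output rather than a same-language reply.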

Key Capabilities

  • Exclusive Kiara Language Generation: Proficient in generating text solely in the invented Kiara language, with intentionally reduced capacity for other languages.
  • Linguistic Identity Embodiment: Designed to reflect the creator's thought process through its dataset and linguistic output.
  • Catastrophic Forgetting Research: Serves as a case study for investigating a small model's ability to "forget" a source language in favor of a new target language.
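For the catastrophic-forgetting use case, one simple measure is perplexity on held-out French versus Kiara text: successful linguistic substitution should raise the former and lower the latter over the course of fine-tuning. A minimal sketch of the metric itself, assuming per-token log-probabilities have already been extracted from the model (the function here is a generic illustration, not part of the card):

```python
import math


def perplexity(token_logprobs):
    """Perplexity from natural-log token probabilities:
    PPL = exp(-mean(log p)). Lower means the text is more expected
    by the model."""
    if not token_logprobs:
        raise ValueError("need at least one token log-probability")
    return math.exp(-sum(token_logprobs) / len(token_logprobs))


# Sanity check: a model assigning every token probability 0.5
# has perplexity 2.0.
uniform_half = [math.log(0.5)] * 4
```

Tracking this quantity on fixed French and Kiara evaluation sets across fine-tuning checkpoints would give a direct curve of the source language being forgotten and the target language being acquired.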

Good for

  • Kiara Language Immersion: Ideal for conversing in and stabilizing the grammar of the Kiara language.
  • Personalized Digital Interfaces: Can be used as a unique digital identity or personalized interface.
  • LLM Research: Valuable for studies on linguistic substitution and the intentional reduction of multilingual capabilities in small models.