MarisUK/maris-ai-text

Text generation · Concurrency cost: 1 · Model size: 1.1B · Quant: BF16 · Context length: 2k · Published: Apr 25, 2026 · License: apache-2.0 · Architecture: Transformer · Open weights

MarisUK/maris-ai-text is a 1.1-billion-parameter Llama-2-architecture model, based on TinyLlama and fine-tuned specifically for chat applications. It adopts the same architecture and tokenizer as Llama 2, making it compatible with existing Llama-based projects. The model is optimized for conversational tasks using a training recipe similar to Zephyr, and its compact size suits applications with tight computational and memory budgets.


MarisUK/maris-ai-text: A Compact Chat Model

This model is a 1.1-billion-parameter chat-finetuned variant from the TinyLlama project, which aims to pretrain a Llama-2-architecture model on 3 trillion tokens. It uses the same architecture and tokenizer as Llama 2, ensuring broad compatibility with open-source projects built on Llama.
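
Because the model shares Llama 2's architecture and tokenizer, it loads with standard Hugging Face Transformers classes. The sketch below is illustrative only: the repo id is taken from this card, and the sampling settings are placeholder assumptions rather than recommended values.

```python
# Minimal loading and generation sketch (assumed settings, not tuned defaults).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "MarisUK/maris-ai-text"  # repo id as listed on this card

tokenizer = AutoTokenizer.from_pretrained(model_id)  # Llama 2 tokenizer
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # BF16, matching the precision listed above
)

prompt = "The TinyLlama project aims to"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```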

Key Capabilities & Training:

  • Compact Size: With only 1.1 billion parameters, it is designed for applications with limited computational and memory resources.
  • Llama 2 Compatibility: Shares architecture and tokenizer with Llama 2 for seamless integration.
  • Chat Optimization: Fine-tuned specifically for conversational AI.
  • Zephyr-like Training: Follows a training recipe similar to Hugging Face's Zephyr models.
  • Multi-stage Fine-tuning:
    • Initially fine-tuned on a variant of the UltraChat dataset, comprising synthetic dialogues generated by ChatGPT.
    • Further aligned with TRL's DPOTrainer on the openbmb/UltraFeedback dataset, which contains 64k prompts with GPT-4-ranked model completions (a prompt-formatting sketch follows this list).
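
Since the training recipe follows Zephyr, conversations are presumably formatted with a Zephyr-style chat template. Assuming the tokenizer ships such a template (as upstream TinyLlama-Chat does), prompts can be built with `apply_chat_template`; the messages below are placeholders.

```python
# Hedged sketch: rendering a conversation with the tokenizer's chat template.
# Assumes the repo ships a Zephyr-style template, as upstream TinyLlama-Chat does.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("MarisUK/maris-ai-text")

# Placeholder conversation; roles follow the system/user/assistant convention.
messages = [
    {"role": "system", "content": "You are a concise, helpful assistant."},
    {"role": "user", "content": "Summarize what DPO alignment does in one sentence."},
]

# Render the turns into the model's expected prompt string and append the
# assistant marker so generation continues from the assistant's turn.
prompt = tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)
print(prompt)
```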

Ideal Use Cases:

  • Resource-constrained environments: Its small footprint suits deployment where compute or memory is limited (see the loading sketch after this list).
  • Chatbot development: Optimized for generating conversational responses.
  • Llama-2 ecosystem projects: Easily integrates into existing Llama-2 based workflows due to architectural consistency.
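
The end-to-end sketch below targets a memory-constrained host. It assumes a recent Transformers release that accepts chat-style input in the text-generation pipeline, plus `accelerate` for `device_map="auto"`; the token budget is an illustrative choice meant to stay inside the 2k context listed above, not a recommended default.

```python
# Illustrative end-to-end chat call on modest hardware (assumed settings).
import torch
from transformers import pipeline

chat = pipeline(
    "text-generation",
    model="MarisUK/maris-ai-text",
    torch_dtype=torch.bfloat16,   # roughly 2.2 GB of weights at 1.1B parameters in BF16
    device_map="auto",            # uses a GPU if present, otherwise the CPU
)

messages = [
    {"role": "user", "content": "Give me two tips for writing clear commit messages."},
]

# Keep prompt plus completion comfortably inside the 2k-token context window.
result = chat(messages, max_new_tokens=256)
print(result[0]["generated_text"][-1]["content"])
```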