Annonys/Minoan-Sovereign-V9

Text Generation · Concurrency Cost: 1 · Model Size: 1.5B · Quantization: BF16 · Context Length: 32k · Published: Mar 26, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights

Minoan-Sovereign-V9 by Annonys is a 1.5 billion parameter, Qwen2-based, instruction-tuned language model, fine-tuned with Unsloth for accelerated training. Its compact size and efficient fine-tuning make it well suited to general instruction-following tasks where computational efficiency and rapid deployment are key considerations.


Annonys/Minoan-Sovereign-V9 Overview

Minoan-Sovereign-V9 is a 1.5 billion parameter instruction-tuned language model developed by Annonys. It is built upon the Qwen2 architecture and was fine-tuned from the unsloth/qwen2.5-1.5b-instruct base model.

Key Capabilities

  • Efficient Training: This model was fine-tuned using Unsloth together with Hugging Face's TRL library, enabling roughly 2x faster training than a standard fine-tuning setup.
  • Compact Size: With 1.5 billion parameters, it offers a balance between performance and computational resource requirements, making it suitable for applications where larger models might be impractical.
  • Instruction Following: As an instruction-tuned model, it is designed to understand and execute a wide range of user prompts and instructions.
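Qwen2-family instruct models are typically prompted with the ChatML chat template; the sketch below illustrates that format (the helper name and the system prompt are assumptions, not part of this model card, and this model's exact template is not specified here). In practice you would call the tokenizer's `apply_chat_template` method rather than building the string by hand.

```python
def build_chatml_prompt(messages):
    """Render a list of {role, content} messages in the ChatML format
    commonly used by Qwen2-family instruct models (illustrative sketch)."""
    parts = []
    for msg in messages:
        parts.append(f"<|im_start|>{msg['role']}\n{msg['content']}<|im_end|>\n")
    # Leave the assistant turn open so the model generates the reply from here.
    parts.append("<|im_start|>assistant\n")
    return "".join(parts)

prompt = build_chatml_prompt([
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Summarize Unsloth in one sentence."},
])
print(prompt)
```

The trailing open `assistant` turn is what cues the model to continue as the assistant rather than echo another user message.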

When to Use This Model

Minoan-Sovereign-V9 is a strong candidate for use cases requiring a capable yet lightweight language model. Its compact size and efficient training process make it particularly well-suited for:

  • Applications with limited computational resources.
  • Scenarios where rapid inference and deployment are critical.
  • General instruction-following tasks where the performance of a 1.5B parameter model is sufficient.
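As a rough check on those resource claims, the raw weight footprint of a 1.5B-parameter model in BF16 (2 bytes per parameter) can be estimated directly; note that actual memory use at inference time is higher once the KV cache and activations for the 32k context are included. A minimal sketch:

```python
PARAMS = 1.5e9           # 1.5 billion parameters (from the model card)
BYTES_PER_PARAM = 2      # BF16 stores each parameter in 2 bytes

weight_bytes = PARAMS * BYTES_PER_PARAM
weight_gb = weight_bytes / 1e9
print(f"Approximate BF16 weight footprint: {weight_gb:.1f} GB")  # ~3.0 GB
```

At about 3 GB of weights, the model fits comfortably on a single consumer GPU, which is what makes it practical where larger models are not.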