Joks8474/Iris-1.3B-Beta
Text generation · Concurrency cost: 1 · Model size: 1.4B · Quant: BF16 · Context length: 2k · Published: Mar 18, 2026 · License: apache-2.0 · Architecture: Transformer · Open weights
Iris-1.3B-Beta is a 1.4 billion parameter instruction-tuned language model developed by Joks8474, based on Microsoft's Phi-1.5 architecture. Fine-tuned for Portuguese and English, the model is designed to be friendly and curious, with a focus on programming-related interactions. Its compact size makes it suitable for deployment on resource-constrained devices such as mobile phones (via Termux) and personal computers.
Iris-1.3B-Beta: A Compact Multilingual LLM
Iris-1.3B-Beta is a 1.4 billion parameter language model developed by Joks8474, built upon Microsoft's Phi-1.5 base architecture. This model is specifically fine-tuned for Portuguese and English, offering a friendly and curious personality with an affinity for programming tasks.
Key Capabilities
- Multilingual Support: Processes and generates text in both Portuguese (Brazil) and English.
- Compact Size: At roughly 1.4 billion parameters, it is small enough to deploy efficiently on modest hardware.
- Programming-Oriented Personality: Exhibits a curious and programming-friendly demeanor in interactions.
- Accessibility: Can be run on a range of devices, including mobile phones (via Termux with llama.cpp), personal computers (using Hugging Face Transformers), and Google Colab; a minimal loading sketch follows this list.
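To make the Transformers route concrete, here is a minimal Python sketch. It assumes the repo id shown on this page (Joks8474/Iris-1.3B-Beta), the BF16 weights listed in the header, and a plain-text prompt format; the card may prescribe a specific chat template, so treat the prompt handling as illustrative rather than definitive.

```python
# Minimal sketch: loading Iris-1.3B-Beta with Hugging Face Transformers.
# Assumes a plain-text prompt; check the model card for the exact template.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Joks8474/Iris-1.3B-Beta"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # the page lists BF16 weights
    device_map="auto",           # needs `accelerate`; falls back to CPU without a GPU
)

prompt = "Explain what a Python list comprehension does."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# Keep generation short: the listed context length is 2k tokens.
outputs = model.generate(**inputs, max_new_tokens=128, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

For the Termux / llama.cpp route mentioned above, the weights would first need to be converted to GGUF; that conversion step is outside the scope of this sketch.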
Good For
- Resource-Constrained Environments: Ideal for applications requiring a capable language model on devices with limited computational power.
- Multilingual Applications: Suitable for tasks involving both Portuguese and English text generation or understanding.
- Interactive Agents: Its friendly and curious personality makes it a good candidate for conversational agents or educational tools, particularly in programming contexts.
- Experimentation: Provides an accessible entry point for developers to experiment with fine-tuned language models on personal hardware.
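As a quick illustration of the multilingual use case above, a bilingual smoke test with the Transformers pipeline API might look like the following. Both prompts are made up for this example; nothing on this page prescribes a prompt format.

```python
# Minimal sketch of a bilingual smoke test using the transformers pipeline API.
from transformers import pipeline

generator = pipeline("text-generation", model="Joks8474/Iris-1.3B-Beta")

prompts = [
    # English and Portuguese (Brazil) versions of the same request,
    # matching the model's two stated fine-tuning languages.
    "Write a one-line docstring for a function that reverses a string.",
    "Escreva uma docstring de uma linha para uma função que inverte uma string.",
]

for prompt in prompts:
    result = generator(prompt, max_new_tokens=64, do_sample=True, temperature=0.7)
    print(result[0]["generated_text"])
```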