and-emili/aera-4b
Text generation · Model size: 4B · Quantization: BF16 · Context length: 32k · Published: May 31, 2025 · License: apache-2.0 · Architecture: Transformer · Open weights

ÆRA-4B is a 4-billion-parameter language model developed by AND EMILI, designed for enterprise applications that require context-grounded reasoning and structured outputs. The model excels at native Italian language processing, at generating responses grounded strictly in the provided context to reduce hallucination, and at producing structured data such as JSON for entity extraction and classification. It supports native function calling, making it well suited to building intelligent agents, RAG implementations, and automation pipelines that prioritize predictable behavior and on-premises deployment.
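As a minimal sketch of how the structured-output and function-calling capabilities described above might be used: the snippet below defines an entity-extraction tool in the common OpenAI-style JSON-schema format and parses the JSON arguments such a tool call would return. The tool name, schema, and example payload are illustrative assumptions, not part of ÆRA-4B's documented API.

```python
import json

# Hypothetical tool definition for an entity-extraction function call,
# assuming an OpenAI-compatible chat endpoint (names are illustrative).
extract_entities_tool = {
    "type": "function",
    "function": {
        "name": "extract_entities",
        "description": "Extract named entities from Italian text.",
        "parameters": {
            "type": "object",
            "properties": {
                "entities": {
                    "type": "array",
                    "items": {
                        "type": "object",
                        "properties": {
                            "text": {"type": "string"},
                            "label": {"type": "string"},
                        },
                        "required": ["text", "label"],
                    },
                }
            },
            "required": ["entities"],
        },
    },
}


def parse_tool_call(raw_arguments: str) -> list:
    """Parse the JSON arguments string a tool call would carry."""
    payload = json.loads(raw_arguments)
    return payload["entities"]


# Example arguments as a model might emit them for a short Italian sentence.
raw = '{"entities": [{"text": "Milano", "label": "LOC"}]}'
print(parse_tool_call(raw))  # → [{'text': 'Milano', 'label': 'LOC'}]
```

Validating the model's JSON against the declared schema before acting on it is what makes this pattern reliable in automation pipelines: malformed output fails fast in `json.loads` rather than propagating downstream.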
