Mattimax/DAC5-0.5B

Text generation · Model size: 0.5B · Quantization: BF16 · Context length: 32k · Published: May 2, 2026 · License: MIT · Architecture: Transformer (open weights)

Mattimax/DAC5-0.5B is a 0.5 billion parameter causal decoder-only Transformer model, based on Qwen2.5, developed by M.INC. Research. Optimized for local inference and system automation on resource-limited devices, it features specialized Italian language fine-tuning and deterministic function calling capabilities for Android APIs. This model excels at on-device assistance and automation tasks, particularly on mobile operating systems, with a context length of 32768 tokens.


What is DAC5-0.5B?

DAC5-0.5B is a compact 0.5-billion-parameter large language model (LLM) developed by M.INC. Research, built upon the Qwen2.5-0.5B-Instruct architecture. It is specifically optimized for local inference and system automation on devices with extremely limited computational resources, such as mid-to-low-end smartphones. The model prioritizes parameter efficiency and fast token processing.
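To see why a 0.5B-parameter model is a plausible fit for such devices, a rough back-of-the-envelope estimate of the weight footprint helps. This sketch assumes 0.5 × 10⁹ parameters stored in BF16 (2 bytes each), as stated in the card, and deliberately ignores the KV cache and activations, which grow with context length:

```python
# Rough weight-memory estimate for DAC5-0.5B (assumptions: 0.5e9 parameters,
# BF16 = 2 bytes per parameter; KV cache and activations are NOT included).
params = 0.5e9
bytes_per_param = 2  # BF16

weight_gib = params * bytes_per_param / 2**30
print(round(weight_gib, 2))  # roughly 0.93 GiB of raw weights
```

Under these assumptions the raw weights come in just under 1 GiB, which is within reach of mid-range phone RAM, though the 32k context window adds KV-cache overhead on top of this at inference time.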

Key Capabilities

  • Specialized Italian Language Support: Fine-tuned with a heavily balanced dataset to improve syntactic and semantic accuracy in Italian, addressing weaknesses of the base model.
  • Agentic Capabilities: Designed for deterministic Function Calling in Android environments, enabling it to map natural language commands to system API calls.
  • Efficiency: Engineered for execution on mobile CPU and NPU hardware.
  • Extensive Tool Calling: Supports a wide range of Android system functions, including managing connectivity (Wi-Fi, Bluetooth, GPS), media (brightness, volume, music control), applications (opening apps, web search, calls, SMS), and camera functions.
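The card does not publish the model's actual tool-call schema, but the deterministic mapping it describes can be sketched in general terms: the model emits a structured call (here assumed to be JSON with hypothetical `name` and `arguments` fields, and illustrative tool names like `set_wifi`), and a client-side dispatcher resolves it to exactly one system API invocation:

```python
import json

# Illustrative handler registry. On Android these would wrap real system
# APIs (Wi-Fi, volume, etc.); here they just record the call so the
# dispatch path can be demonstrated. Tool names and the JSON shape are
# assumptions, not the model's documented schema.
calls = []

def set_wifi(enabled: bool):
    calls.append(("wifi", enabled))

def set_volume(level: int):
    calls.append(("volume", level))

TOOLS = {"set_wifi": set_wifi, "set_volume": set_volume}

def dispatch(raw: str):
    """Parse a model-emitted JSON tool call and invoke the matching handler.

    The mapping is deterministic: the same JSON payload always resolves to
    the same handler with the same arguments; unknown tools raise KeyError
    instead of being guessed at.
    """
    call = json.loads(raw)
    handler = TOOLS[call["name"]]
    return handler(**call["arguments"])

dispatch('{"name": "set_wifi", "arguments": {"enabled": false}}')
dispatch('{"name": "set_volume", "arguments": {"level": 7}}')
```

Keeping the dispatcher strict (exact-match tool names, no fuzzy fallback) is what makes the function-calling path deterministic: any ambiguity is rejected at parse time rather than resolved heuristically.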

Training and Limitations

The model underwent a proprietary Supervised Fine-Tuning (SFT) pipeline by M.INC. Research, focused on compressing reasoning logic while maintaining fluent Italian dialogue. Due to its compact size, DAC5-0.5B may have limitations in abstract mathematical reasoning or in generating very long creative texts. It is best suited for assistance and automation tasks rather than complex generative applications.