arcee-ai/arcee-lite

Text Generation · Model Size: 1.5B · Quant: BF16 · Context Length: 32k · Published: Aug 1, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights

Arcee-Lite is a compact 1.5 billion parameter language model developed by arcee-ai as part of the DistillKit open-source project. Distilled from Phi-3-Medium, it achieves an MMLU score of 55.93, demonstrating strong performance for its small size. This model is optimized for resource-constrained environments, making it suitable for embedded systems, mobile applications, and edge computing.


Arcee-Lite: A Compact and Capable LLM

Arcee-Lite is a 1.5 billion parameter language model from arcee-ai, developed under the open-source DistillKit initiative. This model is a distillation of Phi-3-Medium, designed to deliver high performance within a small footprint.

Key Capabilities & Features

  • Compact Size: With 1.5 billion parameters, Arcee-Lite is highly efficient.
  • MMLU Performance: Achieves a 55.93 score on the MMLU benchmark, indicating strong general language understanding for its size.
  • Distillation Source: Derived from the high-performing Phi-3-Medium model, enhanced through merging with other distillations.
  • DistillKit Project: Part of an open-source effort to create efficient, smaller models that retain high performance.
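The "compact size" claim above is easy to quantify: at the listed BF16 precision, each parameter occupies 2 bytes, so the raw weight footprint is straightforward arithmetic. A quick sketch (the helper name is illustrative, not part of any library):

```python
def bf16_weight_footprint_gb(n_params: float) -> float:
    """Approximate raw weight memory in GB for BF16 (2 bytes/param).

    Excludes activation memory, KV cache, and framework overhead.
    """
    return n_params * 2 / 1e9

# Arcee-Lite: 1.5 billion parameters at BF16
footprint = bf16_weight_footprint_gb(1.5e9)
print(f"{footprint:.1f} GB")  # → 3.0 GB
```

A ~3 GB weight footprint (before runtime overhead) is what makes deployment on mobile and edge hardware plausible; quantizing below BF16 would shrink this further.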

Ideal Use Cases

Arcee-Lite is particularly well-suited for applications where computational resources are limited but robust language capabilities are still required. Its compact size and strong benchmark performance make it an excellent choice for:

  • Embedded Systems: Integrating AI directly into hardware.
  • Mobile Applications: Running language models on smartphones and tablets.
  • Edge Computing: Processing data closer to the source, reducing latency and bandwidth needs.
  • Resource-Constrained Environments: Deploying AI in scenarios with limited memory, processing power, or energy.
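For the deployment scenarios above, the model can be loaded with the Hugging Face `transformers` library. The sketch below assumes the Hub repo id `arcee-ai/arcee-lite` (taken from this card's title) and standard causal-LM loading; verify the exact prompt format against the model card on the Hub:

```python
# Minimal inference sketch for Arcee-Lite with Hugging Face transformers.
# The repo id comes from this card; prompt formatting is an assumption.

MODEL_ID = "arcee-ai/arcee-lite"

def generate(prompt: str, max_new_tokens: int = 128) -> str:
    # Imports are kept inside the function so the module loads even
    # where the heavyweight dependencies are not installed.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype=torch.bfloat16,  # the card lists BF16 weights
        device_map="auto",           # place layers on available devices
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output[0], skip_special_tokens=True)

if __name__ == "__main__":
    print(generate("Explain edge computing in one sentence."))
```

On constrained hardware, the same call can be pointed at a quantized variant of the weights to cut memory use further.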