NbAiLab/borealis-1b-instruct-preview-mlx

Hugging Face
Text generation · Model size: 1B · Quant: BF16 · Context length: 32k · Published: Feb 7, 2026 · License: gemma · Architecture: Transformer

The NbAiLab/borealis-1b-instruct-preview-mlx is a 1 billion parameter instruction-tuned causal language model, converted for use with the MLX framework. Developed by NbAiLab, this model is optimized for efficient deployment and inference on Apple silicon. It features a 32768 token context length, making it suitable for tasks requiring substantial input or output processing within the MLX ecosystem.


Model Overview

The NbAiLab/borealis-1b-instruct-preview-mlx is a 1 billion parameter instruction-tuned language model, specifically converted for the MLX framework. This conversion, performed from the original NbAiLab/borealis-1b-instruct-preview using mlx-lm version 0.29.1, enables efficient execution on Apple silicon.
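Since the card states the model was converted with mlx-lm, the standard mlx-lm loading pattern should apply. A minimal sketch, assuming `mlx-lm` is installed (`pip install mlx-lm`) on an Apple-silicon Mac; the `run` helper and the Norwegian prompt are illustrative, not part of the model's API:

```python
from importlib import util

HAVE_MLX = util.find_spec("mlx_lm") is not None

MODEL_ID = "NbAiLab/borealis-1b-instruct-preview-mlx"

def run(prompt: str, max_tokens: int = 256) -> str:
    """Load the model with mlx-lm and generate a completion.

    Requires mlx-lm, which only runs on Apple silicon.
    """
    from mlx_lm import load, generate

    model, tokenizer = load(MODEL_ID)
    # Instruction-tuned models expect their chat template to be applied.
    if tokenizer.chat_template is not None:
        prompt = tokenizer.apply_chat_template(
            [{"role": "user", "content": prompt}],
            add_generation_prompt=True,
        )
    return generate(model, tokenizer, prompt=prompt, max_tokens=max_tokens)

if HAVE_MLX:
    print(run("Skriv et kort sammendrag av norsk historie."))
```

The chat-template branch matters for instruction-tuned checkpoints: passing raw text without the template usually degrades instruction following.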

Key Capabilities

  • MLX Compatibility: Designed for seamless integration and performance within the MLX ecosystem.
  • Instruction Following: Fine-tuned to understand and execute instructions, making it suitable for prompt-driven tasks such as summarization, question answering, and text drafting.
  • Context Length: Supports a substantial context window of 32768 tokens, allowing for processing longer inputs and generating more extensive outputs.
  • Lightweight: With 1 billion parameters, it offers a balance between performance and computational efficiency.
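The 32768-token window can be budgeted explicitly before sending a prompt. A rough sketch using a characters-per-token heuristic (the heuristic value and the helper function are illustrative; exact counts require the model's tokenizer):

```python
CTX_LEN = 32768          # maximum context length, from the card above
CHARS_PER_TOKEN = 4      # rough heuristic; real counts need the tokenizer

def fits_in_context(document: str, reserved_for_output: int = 1024) -> bool:
    """Estimate whether a prompt leaves room for the requested output tokens."""
    est_tokens = len(document) / CHARS_PER_TOKEN
    return est_tokens + reserved_for_output <= CTX_LEN

print(fits_in_context("hello " * 100))   # short prompt -> True
print(fits_in_context("x" * 200_000))    # ~50k estimated tokens -> False
```

Reserving output tokens up front avoids truncated generations when long documents nearly fill the window.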

Good For

  • Local Inference: Ideal for developers and researchers looking to run instruction-tuned models efficiently on Apple hardware.
  • Prototyping: Its smaller size and MLX optimization make it suitable for rapid experimentation and development.
  • Instruction-based Tasks: Effective for applications requiring the model to follow specific commands or generate responses based on detailed prompts.
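For instruction-based tasks, the prompt markup matters. If the base model follows Gemma-style chat markup (the gemma license above hints at a Gemma lineage, but that is an assumption, and the function below is purely illustrative), a rendered instruction prompt would look like this; in practice, prefer the tokenizer's own chat template:

```python
def gemma_style_prompt(messages: list[dict]) -> str:
    """Render chat messages in Gemma's <start_of_turn> markup (assumed format)."""
    parts = [
        f"<start_of_turn>{m['role']}\n{m['content']}<end_of_turn>\n"
        for m in messages
    ]
    parts.append("<start_of_turn>model\n")  # cue the model to respond
    return "".join(parts)

prompt = gemma_style_prompt(
    [{"role": "user", "content": "Oversett 'good morning' til norsk."}]
)
print(prompt)
```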