AITRADER/Devstral-Small-2505-abliterated-MLX-bf16
Model Overview
AITRADER/Devstral-Small-2505-abliterated-MLX-bf16 is a 24-billion-parameter language model converted to the MLX format from the original huihui-ai/Devstral-Small-2505-abliterated model. The conversion was performed with mlx-lm version 0.30.0 for compatibility and efficient inference on Apple Silicon.
Key Characteristics
- Parameter Count: 24 billion parameters, stored in bfloat16 (bf16) precision as the repository name indicates, balancing capability against compute and memory requirements.
- Context Length: Supports a context window of 32,768 tokens, enabling longer inputs and more coherent extended outputs.
- MLX Format: Optimized for Apple Silicon, providing efficient inference and deployment for developers working within the MLX ecosystem.
Usage
This model is intended for use with the mlx-lm library, allowing for straightforward loading and generation. Developers can integrate it into their MLX-powered applications for various natural language processing tasks.
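As a sketch of that workflow, the snippet below uses the standard mlx-lm `load` and `generate` API to run the model; the prompt text is an arbitrary placeholder, and running it requires Apple Silicon plus enough memory for the bf16 24B weights.

```python
# Install first: pip install mlx-lm
from mlx_lm import load, generate

# Download (if needed) and load the converted weights and tokenizer.
model, tokenizer = load("AITRADER/Devstral-Small-2505-abliterated-MLX-bf16")

prompt = "Write a Python function that reverses a string."

# Apply the model's chat template when one is available.
if tokenizer.chat_template is not None:
    messages = [{"role": "user", "content": prompt}]
    prompt = tokenizer.apply_chat_template(messages, add_generation_prompt=True)

response = generate(model, tokenizer, prompt=prompt, verbose=True)
print(response)
```

The `load` call pulls the repository from the Hugging Face Hub on first use, so the initial run involves a sizable download; subsequent runs read from the local cache.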
Good For
- Developers building applications on Apple Silicon who require a performant 24B parameter language model.
- Experimentation and deployment within the MLX machine learning framework.
- Tasks benefiting from a large context window and the efficiency of MLX.