The mitkox/phi-2-super-OpenHermes-2.5-moe-mlx model is a 3-billion-parameter Mixture-of-Experts (MoE) language model created by merging 'abacaj/phi-2-super' with 'g-ronimo/phi-2-OpenHermes-2.5'. It is packaged for efficient inference on Apple Silicon via the MLX framework, offering a compact yet capable option for general language generation. The MoE architecture aims to combine the strengths of its constituent models, providing a versatile base for a range of applications.
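
A minimal usage sketch with the mlx-lm package is shown below. This assumes the repository is stored in an mlx-lm-compatible format (the -mlx suffix suggests it is) and that mlx-lm supports this merged MoE architecture; verify against the repository's files before relying on it.

```python
# Minimal sketch: loading the model and generating text with mlx-lm on Apple Silicon.
# Assumption: `pip install mlx-lm` is done and the repo loads with the standard
# load/generate interface; the prompt and max_tokens values are illustrative.
from mlx_lm import load, generate

# Download (or reuse a cached copy of) the weights and tokenizer.
model, tokenizer = load("mitkox/phi-2-super-OpenHermes-2.5-moe-mlx")

prompt = "Explain what a Mixture-of-Experts model is in one paragraph."

# generate() decodes on the Metal backend; max_tokens bounds the completion length.
text = generate(model, tokenizer, prompt=prompt, max_tokens=200, verbose=True)
print(text)
```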