mitkox/phi-2-super-OpenHermes-2.5-moe-mlx
Text Generation | Concurrency Cost: 1 | Model Size: 3B | Quant: BF16 | Ctx Length: 2k | Published: Mar 2, 2024 | License: MIT | Architecture: Transformer | Open Weights
mitkox/phi-2-super-OpenHermes-2.5-moe-mlx is a 3-billion-parameter Mixture-of-Experts (MoE) language model created by merging 'abacaj/phi-2-super' with 'g-ronimo/phi-2-OpenHermes-2.5'. It is designed for efficient inference on Apple Silicon via the MLX framework, offering a compact yet capable model for general language generation tasks. Its MoE architecture aims to combine the strengths of its two constituent models, providing a versatile base for a range of applications.
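Since the model is distributed in MLX format, one plausible way to run it locally is with the community `mlx-lm` package. The sketch below is an assumption based on common MLX usage, not part of this model card; the `generate()` keyword arguments can differ between `mlx-lm` versions.

```python
# Sketch: local inference on Apple Silicon with mlx-lm (pip install mlx-lm).
# Assumes the repo id resolves on the Hugging Face Hub; generate() kwargs
# may vary by mlx-lm version.
from mlx_lm import load, generate

# Downloads (if needed) and loads the MLX-format weights and tokenizer.
model, tokenizer = load("mitkox/phi-2-super-OpenHermes-2.5-moe-mlx")

prompt = "Explain what a Mixture-of-Experts model is in one paragraph."
response = generate(model, tokenizer, prompt=prompt, max_tokens=256)
print(response)
```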