marcel/phi-2-openhermes-30k is a 2.7 billion parameter causal language model, converted to MLX format from Microsoft's Phi-2 base model and fine-tuned on the OpenHermes dataset. It supports a 2048-token context length and shows balanced performance across common sense and reasoning benchmarks such as HellaSwag and Winogrande. The model is primarily suited for general-purpose text generation and conversational AI, offering a compact yet capable option for on-device or resource-constrained deployments.
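Because the weights are distributed in MLX format, they can typically be loaded with the mlx-lm package. The snippet below is a minimal sketch assuming the standard `mlx_lm.load`/`mlx_lm.generate` API and the repo id `marcel/phi-2-openhermes-30k`; the prompt is illustrative and not taken from the model card.

```python
# Minimal sketch: load the MLX weights and run a short generation with mlx-lm.
# Assumes `pip install mlx-lm` on Apple Silicon; the repo id and prompt are
# illustrative examples, not confirmed by the model card.
from mlx_lm import load, generate

model, tokenizer = load("marcel/phi-2-openhermes-30k")

prompt = "Explain the difference between a list and a tuple in Python."
response = generate(model, tokenizer, prompt=prompt, max_tokens=256, verbose=True)
print(response)
```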