valuat/DeepSeek-R1-Distill-Llama-8B-mlx-fp16
Text generation · Concurrency cost: 1 · Model size: 8B · Quant: FP8 · Context length: 32k · Published: Sep 30, 2025 · License: MIT · Architecture: Transformer · Open weights · Cold start

valuat/DeepSeek-R1-Distill-Llama-8B-mlx-fp16 is an 8-billion-parameter language model converted to MLX format from the deepseek-ai/DeepSeek-R1-Distill-Llama-8B base model. It is designed for efficient deployment and inference on Apple Silicon via the MLX framework, retaining the base model's capabilities and offering a performant option for general language tasks within the Apple ecosystem.
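A minimal sketch of running inference with this model via the community `mlx-lm` package (not part of this card; assumes an Apple Silicon machine with `pip install mlx-lm`, and that the repo id above is downloadable, e.g. from Hugging Face). The prompt text is illustrative only:

```python
# Sketch: load the MLX-converted checkpoint and generate text with mlx-lm.
# Requires Apple Silicon and the mlx-lm package; the repo id comes from
# this model card.
from mlx_lm import load, generate

# Downloads (or reuses a cached copy of) the MLX weights and tokenizer.
model, tokenizer = load("valuat/DeepSeek-R1-Distill-Llama-8B-mlx-fp16")

prompt = "Summarize what model distillation is in one sentence."
response = generate(model, tokenizer, prompt=prompt, max_tokens=128)
print(response)
```

The same package also exposes a command-line entry point (`mlx_lm.generate --model … --prompt …`) if you prefer not to write Python.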
