alexgusevski/saiga_yandexgpt_8b-mlx
Text generation
- Model size: 8B
- Quantization: FP8
- Context length: 8k
- Concurrency cost: 1
- Architecture: Transformer
- License: yandexgpt-5-lite-8b-pretrain
- Published: Jan 12, 2026

alexgusevski/saiga_yandexgpt_8b-mlx is an 8-billion-parameter language model converted to the MLX format by alexgusevski from the original IlyaGusev/saiga_yandexgpt_8b. The conversion targets efficient deployment and inference on Apple Silicon via the MLX framework, providing a ready-to-use, optimized build of the Saiga YandexGPT 8B model for MLX-compatible environments.
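The card itself does not include usage instructions; below is a minimal sketch of loading the model with the `mlx-lm` package, assuming an Apple Silicon Mac with `pip install mlx-lm`. The example prompt and `max_tokens` value are illustrative only:

```python
def main() -> None:
    # Imported lazily: mlx-lm runs only on Apple Silicon.
    from mlx_lm import load, generate

    # Downloads the converted weights from the Hugging Face Hub on first use.
    model, tokenizer = load("alexgusevski/saiga_yandexgpt_8b-mlx")

    # Saiga YandexGPT is tuned for Russian; format the conversation with
    # the tokenizer's own chat template rather than hand-rolling one.
    messages = [{"role": "user", "content": "Привет! Кто ты?"}]
    prompt = tokenizer.apply_chat_template(
        messages, tokenize=False, add_generation_prompt=True
    )

    # Generate a completion on-device (length capped for the sketch).
    text = generate(model, tokenizer, prompt=prompt, max_tokens=256)
    print(text)


if __name__ == "__main__":
    main()
```

Since the model was published with an 8k context length, prompts plus generated tokens should stay within that window.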
