Mikkkaiikkk/DeepSeek-R1-Distill-Alpaca-FineTuned
Text Generation · Open Weights · Warm
- Model size: 8B
- Quantization: FP8
- Context length: 32k
- Concurrency cost: 1
- Published: Mar 29, 2026
- License: apache-2.0
- Architecture: Transformer
Mikkkaiikkk/DeepSeek-R1-Distill-Alpaca-FineTuned is an 8-billion-parameter language model, fine-tuned by Mikkkaiikkk on the Alpaca dataset from a DeepSeek-R1 distilled base. It is optimized for generating accurate, context-aware responses to domain-specific queries and uses ONNX Runtime for efficient inference. The model targets advanced natural-language understanding and generation tasks, particularly custom AI assistants and specialized content creation.
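As a rough sizing check (a back-of-envelope sketch, not a figure from the model card): FP8 quantization stores one byte per parameter, so the 8B weights alone occupy roughly 8 GB before accounting for the KV cache, activations, or runtime overhead.

```python
# Back-of-envelope memory estimate for the 8B FP8 weights.
# Assumption: FP8 uses exactly 1 byte per parameter; KV cache,
# activations, and framework overhead are ignored.
params = 8_000_000_000
bytes_per_param = 1  # FP8
weight_bytes = params * bytes_per_param
print(f"{weight_bytes / 2**30:.2f} GiB")  # prints 7.45 GiB
```

In practice, serving at the full 32k context adds a KV cache whose size depends on layer count and head dimensions, so expect total memory use to be noticeably higher than the weights alone.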