leskode/deepseek-coder-6.7b-instruct
Text Generation · Open Weights
- Concurrency Cost: 1
- Model Size: 7B
- Quantization: FP8
- Context Length: 4k
- Published: Mar 13, 2026
- License: apache-2.0
- Architecture: Transformer

leskode/deepseek-coder-6.7b-instruct is a 6.7 billion parameter instruction-tuned language model published by leskode, fine-tuned from deepseek-ai/deepseek-coder-6.7b-instruct. Training was accelerated using Unsloth together with Hugging Face's TRL library. The model targets code-related tasks, inheriting the programming capabilities of its base model, and is aimed at developers who want an efficient, code-focused LLM.
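Since the model is fine-tuned from deepseek-coder-6.7b-instruct, prompts are most likely expected in that base model's "### Instruction / ### Response" format. The sketch below shows a minimal prompt builder under that assumption; the exact template (including any system preamble) should be confirmed against the model's tokenizer chat template before use.

```python
# Hedged sketch: wrap a user request in the "### Instruction / ### Response"
# layout documented for the deepseek-coder instruct base model.
# This is an assumption about the fine-tune's expected format, not a
# confirmed specification for leskode/deepseek-coder-6.7b-instruct.

def build_prompt(instruction: str) -> str:
    """Format a single-turn coding instruction for the model."""
    return (
        "### Instruction:\n"
        f"{instruction}\n"
        "### Response:\n"
    )

prompt = build_prompt("Write a Python function that reverses a string.")
print(prompt)
```

The resulting string can then be passed to any inference endpoint serving the model; generation should be stopped at the next `### Instruction:` marker to avoid the model continuing the dialogue on its own.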
