pedrodev2026/pedro-open-coder-v1
Text generation · Concurrency cost: 1 · Model size: 1.5B · Quant: BF16 · Context length: 32k · Published: Mar 4, 2026 · License: bsd-3-clause · Architecture: Transformer · Open weights

pedrodev2026/pedro-open-coder-v1 is a merged language model developed by pedrodev2026, based on Qwen/Qwen2.5-Coder-1.5B-Instruct and deepseek-ai/DeepSeek-R1-Distill-Qwen-1.5B. This model is designed for text generation tasks, leveraging the strengths of its base models. It is primarily intended for code-related applications and instruction-following scenarios.


pedro-open-coder-v1 Overview

pedrodev2026/pedro-open-coder-v1 is a merged language model created by pedrodev2026 from two base models: Qwen/Qwen2.5-Coder-1.5B-Instruct and deepseek-ai/DeepSeek-R1-Distill-Qwen-1.5B. The merge combines the code-focused instruction tuning of the Qwen2.5-Coder base with the reasoning-distilled behavior of the DeepSeek-R1 distillation in a single 1.5B-parameter model.

Key Characteristics

  • Base Models: Built upon Qwen/Qwen2.5-Coder-1.5B-Instruct and deepseek-ai/DeepSeek-R1-Distill-Qwen-1.5B.
  • Language Support: Primarily English (en).
  • Pipeline Tag: Configured for text-generation tasks.
  • License: Distributed under the bsd-3-clause license.

Intended Use Cases

This model is particularly well-suited for:

  • Code Generation: Inheriting capabilities from its coder-focused base models, it is designed for generating and assisting with code.
  • Instruction Following: Tuned to respond to and carry out natural-language instructions, making it suitable for interactive and assistant-style applications.
  • Text Generation: General text generation tasks where a blend of coding and instructional understanding is beneficial.
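For the use cases above, the model can be loaded through the Hugging Face `transformers` library. The sketch below is illustrative, not an official usage snippet from the card: the model id comes from the card, while the chat-template call (standard for Qwen-style instruct models) and the generation settings are assumptions.

```python
# Minimal sketch of running pedro-open-coder-v1 for code generation.
# The model id is from the card; chat formatting and generation
# parameters are illustrative assumptions, not documented settings.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "pedrodev2026/pedro-open-coder-v1"


def generate(prompt: str, max_new_tokens: int = 256) -> str:
    """Generate a completion for a single user prompt."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype="auto")

    # Instruct-tuned Qwen derivatives expect chat-formatted input.
    messages = [{"role": "user", "content": prompt}]
    text = tokenizer.apply_chat_template(
        messages, tokenize=False, add_generation_prompt=True
    )
    inputs = tokenizer(text, return_tensors="pt")
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)

    # Decode only the newly generated tokens, not the echoed prompt.
    new_tokens = output[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)


if __name__ == "__main__":
    print(generate("Write a Python function that reverses a string."))
```

With a BF16 1.5B model and a 32k context window, this fits comfortably on a single consumer GPU or even CPU, which is the typical deployment profile for merges of this size.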

Future Development

A newer version of this model, pedrodev2026/pedro-open-coder-v2, is listed as a successor, indicating ongoing development and potential improvements.