yang-z/CodeV-QC-7B

Task: Text Generation · Model Size: 7.6B · Quantization: FP8 · Context Length: 32k · Published: May 12, 2025 · License: apache-2.0 · Architecture: Transformer · Open Weights

yang-z/CodeV-QC-7B is a 7.6 billion parameter instruction-tuned Large Language Model from the CodeV series, designed for generating high-quality Hardware Description Language (HDL) code. Built on the Qwen/Qwen2.5-Coder-7B base model, it targets the difficulties general-purpose LLMs face with HDL generation and is optimized for tasks requiring precise, functional HDL output.


CodeV-QC-7B: Specialized HDL Code Generation

yang-z/CodeV-QC-7B is a 7.6 billion parameter model within the CodeV series, an innovative collection of open-source, instruction-tuned Large Language Models (LLMs) developed by Yang Zhao and collaborators. This particular model is fine-tuned from the Qwen/Qwen2.5-Coder-7B base model, focusing on the generation of high-quality Hardware Description Language (HDL) code.

Key Capabilities

  • HDL Code Generation: Specifically engineered to produce HDL code, addressing a niche but critical domain where general-purpose LLMs often struggle.
  • Instruction-Tuned: Benefits from instruction tuning to better understand and respond to prompts related to HDL design and implementation.
  • Part of CodeV Series: Trained with the multi-level summarization approach detailed in the associated research paper, which enhances its code generation capabilities.
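As an illustration of how an instruction-tuned code model like this might be prompted, here is a minimal sketch using the Hugging Face transformers API. It assumes the model ships a standard chat template and loads with `AutoModelForCausalLM`; the helper names (`build_messages`, `generate_hdl`) and generation parameters are illustrative, not part of the model card.

```python
# Hypothetical usage sketch for yang-z/CodeV-QC-7B via Hugging Face
# transformers. The chat-template call and generation settings below are
# assumptions; consult the model card for the exact prompt format.

MODEL_ID = "yang-z/CodeV-QC-7B"


def build_messages(spec: str) -> list:
    """Wrap a natural-language hardware spec as a single-turn chat prompt."""
    return [{"role": "user", "content": spec}]


def generate_hdl(spec: str, max_new_tokens: int = 512) -> str:
    """Generate HDL code for `spec` (downloads/loads the model on call)."""
    # Imported lazily so the helper above stays usable without torch installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype="auto", device_map="auto"
    )
    prompt = tokenizer.apply_chat_template(
        build_messages(spec), tokenize=False, add_generation_prompt=True
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Strip the echoed prompt tokens, returning only the generated HDL.
    return tokenizer.decode(
        output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
    )
```

A call such as `generate_hdl("Write a Verilog module for a 4-bit synchronous up-counter with active-high reset.")` would then return the model's Verilog output.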

Good For

  • Hardware Design Automation: Developers and engineers working on digital circuit design who need assistance in generating Verilog or VHDL code.
  • Educational Purposes: Learning and experimenting with HDL code generation in an automated fashion.
  • Research in Code Generation: As a specialized model for exploring the frontiers of LLMs in hardware description languages. The underlying methodology is described in the paper "CodeV: Empowering LLMs with HDL Generation through Multi-Level Summarization" (arXiv:2407.10424).