jadechoi/wizl_base_7b-fsv

Text Generation · Concurrency Cost: 1 · Model Size: 7.6B · Quant: FP8 · Context Length: 32k · Published: Apr 17, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights

jadechoi/wizl_base_7b-fsv is a 7.6-billion-parameter language model fine-tuned by jadechoi from Qwen/Qwen2.5-Coder-7B-Instruct. It has a 32,768-token context length and was trained with the Axolotl framework using Liger kernel optimizations for RoPE, RMSNorm, and SwiGLU. The model is intended primarily for code-related tasks, building on its coder-focused, instruction-tuned base.
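
Assuming the weights are published under that repo id on the Hugging Face Hub and the tokenizer ships with the Qwen2.5 chat template (neither is confirmed by this page), a minimal loading-and-generation sketch with Transformers might look like this:

```python
# Minimal sketch: loading and prompting the model with Hugging Face Transformers.
# Assumes "jadechoi/wizl_base_7b-fsv" resolves on the Hub and the tokenizer carries
# the Qwen2.5 chat template; adjust dtype/device for your hardware. Note that the
# FP8-quantized serving variant may require a different loader than shown here.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "jadechoi/wizl_base_7b-fsv"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

messages = [
    {"role": "user", "content": "Write a Python function that parses an ISO-8601 date."}
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(inputs, max_new_tokens=256)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))
```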

wizl_base_7b-fsv Overview

The wizl_base_7b-fsv model is a 7.6-billion-parameter language model developed by jadechoi. It is a fine-tuned iteration of the Qwen/Qwen2.5-Coder-7B-Instruct base model, trained with the Axolotl framework. The training run enabled several Liger kernel optimizations (Liger RoPE, Liger RMSNorm, and Liger SwiGLU), fused-kernel implementations of those layers that improve training throughput and memory efficiency.
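
Outside of Axolotl, the same three optimizations correspond to flags in the standalone liger-kernel package; a minimal sketch, assuming its Qwen2 patch applies to this checkpoint (the model inherits the Qwen2 architecture from its Qwen2.5-Coder base):

```python
# Minimal sketch: enabling the same Liger optimizations (RoPE, RMSNorm, SwiGLU)
# via the liger-kernel package. The patch targets the Transformers Qwen2 modules,
# which this checkpoint uses through its Qwen2.5-Coder lineage.
from liger_kernel.transformers import apply_liger_kernel_to_qwen2
from transformers import AutoModelForCausalLM

# Monkey-patches the Qwen2 layer implementations with fused Triton kernels;
# must be called before the model is instantiated.
apply_liger_kernel_to_qwen2(rope=True, rms_norm=True, swiglu=True)

model = AutoModelForCausalLM.from_pretrained("jadechoi/wizl_base_7b-fsv")
```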

Key Capabilities

  • Code-centric Processing: Inherits and refines the code understanding and generation capabilities of its Qwen2.5-Coder-7B-Instruct base.
  • Optimized Training: Was fine-tuned with Liger kernel optimizations (RoPE, RMSNorm, SwiGLU), fused implementations that improve compute and memory efficiency during training.
  • Extended Context: Supports a context length of 32,768 tokens, allowing it to process long code files or complex, multi-step instructions (see the token-budget sketch after this list).
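
One practical consequence of the 32,768-token window is that you can verify up front whether an input fits. A small sketch using the model's own tokenizer (the file path is illustrative):

```python
# Minimal sketch: checking that a source file fits within the 32,768-token
# context window before sending it to the model. The path is a placeholder.
from transformers import AutoTokenizer

MAX_CTX = 32768
tokenizer = AutoTokenizer.from_pretrained("jadechoi/wizl_base_7b-fsv")

with open("src/app.py") as f:  # hypothetical input file
    source = f.read()

n_tokens = len(tokenizer.encode(source))
budget = MAX_CTX - n_tokens
print(f"{n_tokens} prompt tokens; {budget} tokens left for instructions and output")
if budget < 512:
    print("Warning: little room left for generation; consider truncating the input.")
```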

Good for

  • Code Generation and Completion: Ideal for tasks requiring the generation or completion of programming code.
  • Code Understanding and Analysis: Suitable for applications that involve interpreting or analyzing code structures and logic.
  • Developer Tools: Can be integrated into IDEs or other developer tools to assist with coding tasks; a minimal API sketch follows this list.
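
For the developer-tools case, hosted deployments of models like this one are typically exposed through an OpenAI-compatible API; a minimal integration sketch, with the endpoint and key as placeholders:

```python
# Minimal sketch: calling the model through an OpenAI-compatible chat endpoint,
# the usual integration path for editor plugins and other developer tools.
# The base_url and api_key values are placeholders for your provider's settings.
from openai import OpenAI

client = OpenAI(
    base_url="https://api.example.com/v1",  # placeholder: your provider's endpoint
    api_key="YOUR_API_KEY",                 # placeholder
)

response = client.chat.completions.create(
    model="jadechoi/wizl_base_7b-fsv",
    messages=[
        {"role": "user", "content": "Explain what this regex does: ^\\d{4}-\\d{2}-\\d{2}$"}
    ],
    max_tokens=256,
)
print(response.choices[0].message.content)
```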