dtometzki/Qwen2.5-Coder-7B-Kaballas-abap

Text Generation · Concurrency Cost: 1 · Model Size: 8B · Quant: FP8 · Ctx Length: 32k · Published: Dec 19, 2025 · License: apache-2.0 · Architecture: Transformer · Open Weights

The dtometzki/Qwen2.5-Coder-7B-Kaballas-abap is an 8-billion-parameter Qwen3 model, finetuned by dtometzki from nvidia/Nemotron-Cascade-8B-Thinking. The finetuning was accelerated using Unsloth and Hugging Face's TRL library. With a 32,768-token context length, it is designed for code-related tasks, particularly those involving ABAP.


Model Overview

The dtometzki/Qwen2.5-Coder-7B-Kaballas-abap is an 8-billion-parameter, Qwen3-based language model finetuned by dtometzki. It originates from the nvidia/Nemotron-Cascade-8B-Thinking base model and features a substantial 32,768-token context length, making it suitable for processing longer code sequences.

Key Characteristics

  • Base Model: Finetuned from nvidia/Nemotron-Cascade-8B-Thinking.
  • Training Optimization: The finetuning process leveraged Unsloth and Hugging Face's TRL library, enabling roughly 2x faster training.
  • Parameter Count: 8 billion parameters.
  • Context Length: Supports a 32,768-token context window.
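
For reference, the sketch below shows how a checkpoint like this is typically loaded with Hugging Face's transformers library. It is a minimal sketch, assuming the model is published as a standard causal-LM checkpoint on the Hugging Face Hub; the dtype and device settings are illustrative and should be adapted to your hardware.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "dtometzki/Qwen2.5-Coder-7B-Kaballas-abap"

# Fetch the tokenizer and weights from the Hugging Face Hub.
tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    torch_dtype=torch.bfloat16,  # assumed precision; adjust for your GPU
    device_map="auto",           # spread layers across available devices
)
```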

Intended Use Cases

This model is specifically tailored for code generation and understanding, with a particular focus on the ABAP programming language. Its optimized training and substantial context length make it a strong candidate for:

  • ABAP Code Generation: Assisting developers in writing ABAP code.
  • Code Completion: Providing intelligent suggestions for ABAP syntax and logic.
  • Code Analysis: Understanding and interpreting existing ABAP codebases.
  • Developer Tooling: Integration into IDEs or development workflows for ABAP-centric tasks.
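
Continuing from the loading sketch above, a hypothetical ABAP generation call might look as follows. The prompt and sampling parameters are illustrative, and the chat template is assumed to be inherited from the base model.

```python
# `tokenizer` and `model` come from the loading sketch above.
messages = [
    {
        "role": "user",
        "content": "Write an ABAP method that selects all rows from SFLIGHT "
                   "into an internal table and returns it.",
    }
]

# Render the conversation with the model's chat template and generate.
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(
    input_ids,
    max_new_tokens=512,  # illustrative budget for a short ABAP method
    do_sample=True,
    temperature=0.2,     # low temperature keeps generated code conservative
)

# Decode only the newly generated tokens, skipping the echoed prompt.
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```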