Model Overview
dtometzki/Qwen2.5-Coder-7B-Kaballas-abap is an 8-billion-parameter, Qwen3-based language model finetuned by dtometzki. It was trained from the nvidia/Nemotron-Cascade-8B-Thinking base model and supports a 32768-token context length, making it suitable for processing longer code sequences.
Key Characteristics
- Base Model: Finetuned from nvidia/Nemotron-Cascade-8B-Thinking.
- Training Optimization: The finetuning process used Unsloth together with Hugging Face's TRL library, enabling roughly 2x faster training.
- Parameter Count: 8 billion parameters.
- Context Length: Supports a 32768-token context window.
Intended Use Cases
This model is specifically tailored for code generation and understanding, with a particular focus on the ABAP programming language. Its optimized training and substantial context length make it a strong candidate for:
- ABAP Code Generation: Assisting developers in writing ABAP code.
- Code Completion: Providing intelligent suggestions for ABAP syntax and logic.
- Code Analysis: Understanding and interpreting existing ABAP codebases.
- Developer Tooling: Integration into IDEs or development workflows for ABAP-centric tasks.
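As a minimal sketch of the use cases above, the model can be loaded with the Hugging Face `transformers` library and prompted for ABAP code. The repo id comes from this card; the helper function, system prompt wording, and generation parameters are illustrative assumptions, not part of the published model card.

```python
def build_abap_prompt(task: str) -> list[dict]:
    """Build a chat-style message list asking the model for ABAP code.

    The prompt wording here is an assumption; adjust it to your workflow.
    """
    return [
        {"role": "system", "content": "You are an expert ABAP developer."},
        {"role": "user", "content": f"Write ABAP code for the following task:\n{task}"},
    ]


def main() -> None:
    # The heavy import and model download happen only when run directly.
    from transformers import pipeline

    generator = pipeline(
        "text-generation",
        model="dtometzki/Qwen2.5-Coder-7B-Kaballas-abap",
    )
    messages = build_abap_prompt(
        "Read all flights from table SFLIGHT and list the carrier IDs."
    )
    # max_new_tokens is an illustrative choice; tune it for your task.
    result = generator(messages, max_new_tokens=512)
    print(result[0]["generated_text"])


if __name__ == "__main__":
    main()
```

Running the script downloads the model weights on first use, so a GPU with sufficient memory (or a quantized variant) is advisable for an 8B-parameter model.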