dtometzki/Qwen2.5-Coder-7B-Kaballas-abap
Text generation · Concurrency cost: 1 · Model size: 8B · Quantization: FP8 · Context length: 32k · Published: Dec 19, 2025 · License: apache-2.0 · Architecture: Transformer · Open weights

dtometzki/Qwen2.5-Coder-7B-Kaballas-abap is a roughly 8-billion-parameter model based on Qwen2.5-Coder-7B, fine-tuned by dtometzki. Training was accelerated with Unsloth and Hugging Face's TRL library. With a 32,768-token context length, it is designed for code-related tasks, particularly those involving ABAP.
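A minimal sketch of using the model for an ABAP coding task with the Hugging Face `transformers` library. The system prompt, generation settings, and helper function names here are illustrative assumptions, not part of the model card; the model ID is the one named above.

```python
# Hedged sketch: prompting the fine-tuned model for ABAP code generation.
# Assumes `transformers` and `torch` are installed; the chat template comes
# from the base Qwen2.5-Coder tokenizer.

MODEL_ID = "dtometzki/Qwen2.5-Coder-7B-Kaballas-abap"


def build_messages(task: str) -> list:
    """Build a chat-style prompt for an ABAP coding task.

    The system prompt is an illustrative assumption, not prescribed
    by the model card.
    """
    return [
        {"role": "system", "content": "You are an expert ABAP developer."},
        {"role": "user", "content": task},
    ]


def generate_abap(task: str, max_new_tokens: int = 512) -> str:
    """Load the model and generate a completion for `task`.

    Imports are done lazily because loading an ~8B-parameter model
    requires substantial memory and a prior download.
    """
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype="auto", device_map="auto"
    )
    inputs = tokenizer.apply_chat_template(
        build_messages(task), add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)
    output = model.generate(inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the echoed prompt.
    return tokenizer.decode(
        output[0][inputs.shape[-1]:], skip_special_tokens=True
    )
```

For example, `generate_abap("Write an ABAP SELECT that reads flight data from SFLIGHT.")` would return the model's ABAP snippet; staying well inside the 32k-token context leaves room for long source files in the prompt.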
