JFernandoGRE/qwen_sft_bundesversammlung_lawmakerlevel_all

Text generation · Concurrency cost: 1 · Model size: 7.6B · Quant: FP8 · Ctx length: 32k · Published: Apr 6, 2026 · License: apache-2.0 · Architecture: Transformer · Open weights

JFernandoGRE/qwen_sft_bundesversammlung_lawmakerlevel_all is a 7.6 billion parameter Qwen2.5-based instruction-tuned language model developed by JFernandoGRE. It was fine-tuned with Unsloth and Hugging Face's TRL library for faster training, and is specifically adapted for tasks related to the German Bundesversammlung (Federal Convention) at the individual lawmaker level, building on the Qwen2.5 architecture for these specialized applications.


Model Overview

JFernandoGRE/qwen_sft_bundesversammlung_lawmakerlevel_all is a specialized 7.6 billion parameter language model, fine-tuned from the unsloth/qwen2.5-7b-instruct-unsloth-bnb-4bit base model. Developed by JFernandoGRE, it builds on the Qwen2.5 architecture and was trained with the Unsloth library in conjunction with Hugging Face's TRL library for improved training efficiency.
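
The card does not include a usage snippet, so the following is a minimal inference sketch using the standard transformers chat workflow. It assumes the repository ships a Qwen2.5-style chat template; the prompt and generation settings are illustrative only.

```python
# Minimal inference sketch via transformers; assumes a Qwen2.5-style
# chat template is bundled with the checkpoint.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "JFernandoGRE/qwen_sft_bundesversammlung_lawmakerlevel_all"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # use the checkpoint's stored precision
    device_map="auto",    # place weights on the available device(s)
)

# Illustrative prompt; any lawmaker-level question follows the same pattern.
messages = [
    {"role": "user", "content": "Summarize the role of the Bundesversammlung in electing the Federal President."},
]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=256)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```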

Key Capabilities

  • Specialized Domain Focus: Specifically fine-tuned for tasks related to the German Bundesversammlung (Federal Convention) and lawmaker-level data.
  • Efficient Training: Benefits from Unsloth's optimizations, which the Unsloth project reports as roughly 2x faster fine-tuning; a sketch of the typical setup follows this list.
  • Qwen2.5 Foundation: Inherits the robust capabilities of the Qwen2.5 instruction-tuned series, providing a strong base for its specialized applications.
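
For context, below is a minimal sketch of the usual Unsloth + TRL supervised fine-tuning recipe for this base checkpoint. The dataset file, LoRA settings, and hyperparameters are illustrative assumptions; the actual training configuration is not published in this card.

```python
# Sketch of a typical Unsloth + TRL SFT setup. The dataset path, LoRA
# rank, and hyperparameters are assumptions, not the author's config.
from unsloth import FastLanguageModel
from trl import SFTConfig, SFTTrainer
from datasets import load_dataset

model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="unsloth/qwen2.5-7b-instruct-unsloth-bnb-4bit",
    max_seq_length=32768,   # matches the advertised 32k context
    load_in_4bit=True,      # 4-bit base weights, as the repo name suggests
)

# Attach LoRA adapters so only a small set of weights is trained.
model = FastLanguageModel.get_peft_model(
    model,
    r=16,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
)

# Hypothetical JSONL file with a pre-formatted "text" field per example.
dataset = load_dataset("json", data_files="bundesversammlung_sft.jsonl")["train"]

trainer = SFTTrainer(
    model=model,
    tokenizer=tokenizer,
    train_dataset=dataset,
    args=SFTConfig(
        dataset_text_field="text",
        per_device_train_batch_size=2,
        num_train_epochs=1,
        output_dir="outputs",
    ),
)
trainer.train()
```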

Good For

  • Analyzing or generating text related to the German Bundesversammlung and the broader parliamentary discourse around it.
  • Researching lawmaker-specific information within the context of the Bundesversammlung.
  • Applications requiring a language model with a focused understanding of German political discourse at a granular level.