JFernandoGRE/qwen_sft_bundesversammlung_lawmakerlevel_all
Text generation · Concurrency cost: 1 · Model size: 7.6B · Quant: FP8 · Context length: 32k · Published: Apr 6, 2026 · License: apache-2.0 · Architecture: Transformer · Open weights
JFernandoGRE/qwen_sft_bundesversammlung_lawmakerlevel_all is a 7.6-billion-parameter instruction-tuned language model based on Qwen2.5, developed by JFernandoGRE. It was fine-tuned with Unsloth and Hugging Face's TRL library, enabling faster training. The model is adapted for lawmaker-level tasks related to the German Federal Convention (Bundesversammlung), leveraging the Qwen2.5 architecture for this specialized domain.
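Since this is a Qwen2.5-based instruct model with open weights, it can in principle be loaded with the standard `transformers` chat workflow. A minimal sketch follows, assuming the weights are retrievable under the repo id shown (hosting location, system prompt, and generation settings are illustrative, not taken from the model card):

```python
MODEL_ID = "JFernandoGRE/qwen_sft_bundesversammlung_lawmakerlevel_all"


def build_messages(question: str) -> list[dict]:
    """Wrap a user question in the chat-message format Qwen2.5 instruct
    models expect (the system prompt here is an illustrative placeholder)."""
    return [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": question},
    ]


def generate(question: str, max_new_tokens: int = 256) -> str:
    """Load the model and answer a single question.
    transformers is imported lazily so build_messages stays usable without it."""
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype="auto", device_map="auto"
    )
    # Render the chat messages into the model's prompt template.
    prompt = tokenizer.apply_chat_template(
        build_messages(question), tokenize=False, add_generation_prompt=True
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    out = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the echoed prompt.
    return tokenizer.decode(
        out[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
    )
```

The stated 32k context length applies to the combined prompt and completion, so long Bundesversammlung-related inputs should leave headroom for `max_new_tokens`.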