FinaPolat/qwen3_8b_sft-1k_ED
Text Generation | Concurrency cost: 1 | Model size: 8B | Quantization: FP8 | Context length: 32k | Published: Feb 12, 2026 | License: apache-2.0 | Architecture: Transformer | Open weights | Cold

FinaPolat/qwen3_8b_sft-1k_ED is an 8-billion-parameter Qwen3 model fine-tuned by FinaPolat. Training used Unsloth together with Hugging Face's TRL library for accelerated fine-tuning, and the model supports a 32,768-token context length. Its primary differentiator is this optimized training process, which makes it suitable for rapid deployment across a range of NLP tasks.
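A minimal usage sketch with the Transformers library is shown below. It assumes the checkpoint is available on the Hugging Face Hub under the id above and that its tokenizer ships a chat template (standard for Qwen3 derivatives); the `generate_reply` helper and the example prompt are illustrative, not part of the model release.

```python
MODEL_ID = "FinaPolat/qwen3_8b_sft-1k_ED"

def generate_reply(user_message: str, max_new_tokens: int = 256) -> str:
    """Run a single-turn chat completion against the fine-tuned checkpoint.

    Heavy dependencies are imported lazily so this module can be inspected
    without transformers/torch installed.
    """
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype="auto",   # pick up the checkpoint's native precision
        device_map="auto",    # place weights on available GPU(s)/CPU
    )

    # Format the conversation with the tokenizer's built-in chat template.
    messages = [{"role": "user", "content": user_message}]
    prompt = tokenizer.apply_chat_template(
        messages, tokenize=False, add_generation_prompt=True
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the echoed prompt.
    new_tokens = output[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)

if __name__ == "__main__":
    print(generate_reply("Summarize the benefits of FP8 quantization."))
```

Note that an 8B model in FP8 still requires roughly 8-10 GB of accelerator memory; adjust `device_map` or offloading settings for smaller GPUs.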
