zero9tech/Qwen3-8B-DataScience-TR
Text Generation · Model size: 8B · Quant: FP8 · Context length: 32k · Concurrency cost: 1 · Published: Apr 12, 2026 · License: apache-2.0 · Architecture: Transformer · Open weights
zero9tech/Qwen3-8B-DataScience-TR is an 8-billion-parameter Qwen3 model developed by zero9tech and fine-tuned from unsloth/Qwen3-8B-unsloth-bnb-4bit. It was trained with Unsloth and Hugging Face's TRL library, a combination the authors report enables roughly 2x faster training. The model targets data science applications and supports a 32,768-token context length.
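The card does not include a usage snippet, so the following is a minimal sketch of how such a model is typically prompted. The ChatML-style tags are an assumption based on Qwen-family tokenizers, and the `build_chatml_prompt` helper is purely illustrative; in practice you would rely on the tokenizer's own chat template.

```python
# Minimal usage sketch for zero9tech/Qwen3-8B-DataScience-TR.
# Assumptions: the tokenizer uses a ChatML-style template (standard for
# Qwen-family models); verify against the repo's tokenizer_config.json.
from typing import Dict, List

MODEL_ID = "zero9tech/Qwen3-8B-DataScience-TR"
MAX_CTX = 32_768  # context length stated on the card


def build_chatml_prompt(messages: List[Dict[str, str]]) -> str:
    """Render messages in a ChatML-style layout for illustration only.

    With the real tokenizer you would call
    tokenizer.apply_chat_template(messages, add_generation_prompt=True)
    instead of formatting the string by hand.
    """
    parts = [f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>" for m in messages]
    parts.append("<|im_start|>assistant\n")  # open the assistant turn for generation
    return "\n".join(parts)


prompt = build_chatml_prompt(
    [{"role": "user", "content": "Summarize the missing values in this dataset."}]
)

# Typical loading path (not executed here; downloads ~8B weights):
# from transformers import AutoModelForCausalLM, AutoTokenizer
# tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
# model = AutoModelForCausalLM.from_pretrained(
#     MODEL_ID, torch_dtype="auto", device_map="auto"
# )
```

Keeping the total prompt under the 32,768-token limit is the caller's responsibility; the helper above only shows the turn layout.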