Apel-sin/rewiz-qwen-2.5-14b
Text Generation · Concurrency Cost: 1 · Model Size: 14.8B · Quant: FP8 · Ctx Length: 32k · License: apache-2.0 · Architecture: Transformer · Open Weights · Warm

Apel-sin/rewiz-qwen-2.5-14b is a 14.8-billion-parameter Qwen2.5 model developed by theprint, fine-tuned from unsloth/qwen2.5-14b-bnb-4bit (a 4-bit bitsandbytes quantization of Qwen2.5-14B). It was trained with Unsloth and Hugging Face's TRL library, which the authors report yields roughly 2x faster training. The model is intended for general language tasks.
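A minimal usage sketch with the Hugging Face transformers library, assuming the model ships a standard Qwen2.5 chat template. The repo id comes from this card; the prompt text, `device_map` setting, and generation parameters are illustrative assumptions, not documented defaults for this model.

```python
# Hypothetical inference sketch for Apel-sin/rewiz-qwen-2.5-14b.
# Assumes the transformers library is installed and the model's
# tokenizer provides a chat template (standard for Qwen2.5 models).
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "Apel-sin/rewiz-qwen-2.5-14b"


def build_chat_prompt(tokenizer, user_message: str) -> str:
    """Format a single-turn chat prompt using the tokenizer's chat template."""
    messages = [{"role": "user", "content": user_message}]
    return tokenizer.apply_chat_template(
        messages, tokenize=False, add_generation_prompt=True
    )


if __name__ == "__main__":
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")

    prompt = build_chat_prompt(tokenizer, "Explain FP8 quantization briefly.")
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=256)

    # Decode only the newly generated tokens, skipping the prompt.
    reply = tokenizer.decode(
        outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
    )
    print(reply)
```

Note that the full 14.8B model needs substantial GPU memory even at reduced precision; `device_map="auto"` lets transformers spread the weights across available devices.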
