theprint/ReWiz-Qwen-2.5-14B
Type: Text generation
Concurrency cost: 1
Model size: 14.8B
Quantization: FP8
Context length: 32k
Published: Nov 5, 2024
License: apache-2.0
Architecture: Transformer
Weights: Open

theprint/ReWiz-Qwen-2.5-14B is a 14.8 billion parameter model developed by theprint, fine-tuned from unsloth/qwen2.5-14b-bnb-4bit and built on the Qwen2 architecture. It was trained 2x faster using Unsloth together with Hugging Face's TRL library, providing an efficient fine-tuning workflow. The model is suitable for general language tasks, with benchmark scores available for reasoning and knowledge-based evaluations.
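For context, a minimal sketch of loading this model with the Hugging Face `transformers` library is shown below. The generation settings (`device_map="auto"`, `max_new_tokens`) are illustrative assumptions, not values specified by the model author, and running it requires a GPU with sufficient memory for a 14.8B-parameter model.

```python
# Sketch: text generation with theprint/ReWiz-Qwen-2.5-14B via transformers.
# Assumes `transformers` and a suitable torch backend are installed.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "theprint/ReWiz-Qwen-2.5-14B"


def generate(prompt: str, max_new_tokens: int = 128) -> str:
    """Return a text completion for `prompt` (settings are illustrative)."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output[0], skip_special_tokens=True)


if __name__ == "__main__":
    print(generate("Summarize the Qwen2 architecture in one sentence."))
```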
