theprint/ReWiz-7B
Text Generation
Concurrency Cost: 1
Model Size: 7B
Quant: FP8
Ctx Length: 4k
Published: Oct 8, 2024
License: apache-2.0
Architecture: Transformer
Open Weights

ReWiz-7B is a 7-billion-parameter language model developed by theprint, fine-tuned from Mistral 7B Instruct v0.3. It was trained with a focus on improving reasoning capabilities, using datasets such as EvolKit-20k and reasoning-base-20k, alongside WizardLM data for de-censoring. The model is optimized for tasks that require stronger reasoning and aims to produce less restricted outputs.
