umd-zhou-lab/recycled-wizardlm-7b-v1.0
Text generation · Concurrency cost: 1 · Model size: 7B · Quantization: FP8 · Context length: 4k · License: llama2 · Architecture: Transformer · Open weights · Cold

umd-zhou-lab/recycled-wizardlm-7b-v1.0 is a 7-billion-parameter autoregressive language model developed by the UMD Tianyi Zhou Lab. It is fine-tuned from Llama-2-7b using a novel data recycling method applied to the WizardLM (70k) instruction data. The model shows improved performance on benchmarks such as AlpacaEval and MMLU compared to its base model, making it suitable for research on instruction tuning and chatbot development.
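Since the weights are open, the model can be loaded like any other Llama-2-based checkpoint. The sketch below is a minimal example using the Hugging Face transformers library; the single-turn WizardLM-style prompt template in `build_prompt` is an assumption and should be verified against the official model card before use.

```python
MODEL_ID = "umd-zhou-lab/recycled-wizardlm-7b-v1.0"

def build_prompt(instruction: str) -> str:
    # Assumed WizardLM-style single-turn template -- verify against the model card.
    return f"{instruction}\n\n### Response:"

def generate(instruction: str, max_new_tokens: int = 256) -> str:
    # Imported here to defer the heavy dependency until generation is requested.
    # Loading ~7B parameters requires a GPU with sufficient memory (or offloading).
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")
    inputs = tokenizer(build_prompt(instruction), return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, skipping the echoed prompt.
    return tokenizer.decode(
        output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
    )
```

For hosted FP8 inference (as listed above), the same prompt format would be sent through the provider's API rather than loading the weights locally.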
