umd-zhou-lab/claude2-alpaca-13B
Task: Text generation
Model size: 13B parameters
Quantization: FP8
Context length: 4K
Concurrency cost: 1
Published: Sep 18, 2023
License: llama2
Architecture: Transformer
Weights: open

umd-zhou-lab/claude2-alpaca-13B is a 13-billion-parameter auto-regressive language model from the UMD Tianyi Zhou Lab, fine-tuned from Llama-2-13b on instruction-tuning data distilled from Claude 2. The goal is to improve on the base Llama-2-chat models: it reports higher average scores than Llama-2-13b-chat on benchmarks such as ARC, HellaSwag, and MMLU. The model is intended primarily for research on large language models and chatbots.
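Since the model is instruction-tuned on Alpaca-style data, a typical way to query it is to wrap the instruction in the standard Alpaca prompt template and generate with the Hugging Face `transformers` API. The sketch below assumes that template and a machine with enough memory for a 13B model; check the model card for the exact prompt format the authors used.

```python
def build_alpaca_prompt(instruction: str, input_text: str = "") -> str:
    """Wrap a user instruction in the standard Alpaca prompt template
    (assumed here; verify against the model card)."""
    if input_text:
        return (
            "Below is an instruction that describes a task, paired with an "
            "input that provides further context. Write a response that "
            "appropriately completes the request.\n\n"
            f"### Instruction:\n{instruction}\n\n"
            f"### Input:\n{input_text}\n\n"
            "### Response:\n"
        )
    return (
        "Below is an instruction that describes a task. Write a response "
        "that appropriately completes the request.\n\n"
        f"### Instruction:\n{instruction}\n\n"
        "### Response:\n"
    )


def generate(instruction: str, max_new_tokens: int = 256) -> str:
    """Load the model and generate a response. Requires a GPU with enough
    memory for the 13B weights; this function is illustrative only."""
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "umd-zhou-lab/claude2-alpaca-13B"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

    inputs = tokenizer(build_alpaca_prompt(instruction), return_tensors="pt")
    inputs = inputs.to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, skipping the prompt.
    new_tokens = outputs[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)


if __name__ == "__main__":
    print(build_alpaca_prompt("Summarize the Llama-2 architecture in one sentence."))
```

The prompt helper is separated from the (heavyweight) generation call so the template can be inspected or reused without downloading the weights.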
