openllmplayground/openalpaca_7b_700bt_preview
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quantization: FP8 · Context Length: 4k · License: apache-2.0 · Architecture: Transformer · Open Weights

OpenAlpaca is a 7-billion-parameter instruction-following language model developed by Yixuan Su, Tian Lan, and Deng Cai. It was fine-tuned from the OpenLLaMA 7B 700BT preview checkpoint on the Databricks Dolly 15k dataset. The model targets general instruction-following tasks, supports a 4096-token context window, and is permissively licensed under Apache 2.0 for both academic and commercial use.
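Below is a minimal usage sketch with the Hugging Face `transformers` library. It assumes the checkpoint is published on the Hub under the id shown here and that the model expects an Alpaca-style instruction prompt; verify both against the official OpenAlpaca repository before relying on them.

```python
# Minimal inference sketch for OpenAlpaca (assumptions: the Hub id below
# is correct, and the model follows the Alpaca-style prompt template).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "openllmplayground/openalpaca_7b_700bt_preview"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Alpaca-style instruction prompt (assumed format).
instruction = "What is an alpaca? How is it different from a llama?"
prompt = f"### Instruction:\n{instruction}\n\n### Response:"

inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=128,  # stays well inside the 4096-token context window
    do_sample=True,
    top_p=0.9,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Sampling parameters such as `top_p` are illustrative defaults, not values recommended by the model authors.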
