arcee-ai/Saul-Instruct-Clown-7b
TEXT GENERATION
Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Mar 12, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights · Cold

arcee-ai/Saul-Instruct-Clown-7b is a 7-billion-parameter instruction-tuned language model created by arcee-ai by merging CorticalStack/pastiche-crown-clown-7b-dare-dpo and Equall/Saul-Instruct-v1. The model performs well across a range of benchmarks, reaching an average score of 72.79 on the OpenLLM benchmark suite. With a 4096-token context length, it is suited to general instruction-following tasks and conversational AI applications.
