arcee-ai/SEC-Calme-7B-Instruct
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Context Length: 4k · Published: Mar 30, 2024 · License: apache-2.0 · Architecture: Transformer
SEC-Calme-7B-Instruct is a 7-billion-parameter instruction-tuned language model from arcee-ai, built by merging arcee-ai/sec-mistral-7b-instruct-1.2-epoch and MaziyarPanahi/Calme-7B-Instruct-v0.2. The merge uses the SLERP (spherical linear interpolation) method to combine the strengths of its base components, yielding general-purpose instruction-following capability. With a context length of 4096 tokens, it is suited to a wide range of conversational and text generation tasks.
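To illustrate the SLERP merge method mentioned above, here is a minimal sketch of spherical linear interpolation applied to two flattened weight tensors. This is a toy illustration of the underlying math, not the actual merge pipeline used to produce this model (tools such as mergekit apply SLERP per layer with configurable interpolation factors):

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherically interpolate between two flattened weight vectors.

    t=0 returns v0, t=1 returns v1; intermediate values follow the
    great-circle arc between the two (normalized) directions.
    """
    v0_n = v0 / (np.linalg.norm(v0) + eps)
    v1_n = v1 / (np.linalg.norm(v1) + eps)
    dot = np.clip(np.dot(v0_n, v1_n), -1.0, 1.0)
    # Nearly parallel vectors: fall back to plain linear interpolation
    if abs(dot) > 0.9995:
        return (1 - t) * v0 + t * v1
    theta = np.arccos(dot)
    sin_theta = np.sin(theta)
    return (np.sin((1 - t) * theta) / sin_theta) * v0 + \
           (np.sin(t * theta) / sin_theta) * v1

# Toy example: merge two orthogonal unit "weight" vectors at the midpoint
a = np.array([1.0, 0.0, 0.0, 0.0])
b = np.array([0.0, 1.0, 0.0, 0.0])
merged = slerp(0.5, a, b)
```

Unlike naive linear averaging, SLERP preserves the norm when interpolating between unit vectors, which is one motivation for using it when blending model weights.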