arcee-ai/SEC-1.6-Calme-7B-Instruct
TEXT GENERATION · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Mar 31, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights
arcee-ai/SEC-1.6-Calme-7B-Instruct is a 7-billion-parameter instruction-tuned language model created by arcee-ai by merging arcee-ai/sec-mistral-7b-instruct-1.6-epoch and MaziyarPanahi/Calme-7B-Instruct-v0.2 with mergekit. Built on the Mistral architecture, it targets general instruction-following tasks and offers a 4,096-token context length. The model merge is intended to combine the strengths of its constituent models, pairing SEC-domain tuning with general instruction-following ability.
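The source does not publish the merge recipe, but a mergekit merge of two same-architecture 7B Mistral models is typically described by a short YAML config. The sketch below is a hypothetical example only: the merge method (slerp), interpolation weight, layer ranges, and dtype are assumptions, not the actual recipe used for this model; only the two parent model names come from the card above.

```yaml
# Hypothetical mergekit config sketch — method and parameters are assumed,
# only the two parent models are taken from the model card.
slices:
  - sources:
      - model: arcee-ai/sec-mistral-7b-instruct-1.6-epoch
        layer_range: [0, 32]   # assumed: full 32-layer Mistral-7B stack
      - model: MaziyarPanahi/Calme-7B-Instruct-v0.2
        layer_range: [0, 32]
merge_method: slerp            # assumed merge method; not stated in the card
base_model: arcee-ai/sec-mistral-7b-instruct-1.6-epoch
parameters:
  t: 0.5                       # assumed: equal interpolation between parents
dtype: bfloat16
```

With mergekit installed, a config like this would be run as `mergekit-yaml config.yml ./output-model`; the actual recipe for SEC-1.6-Calme-7B-Instruct may differ in method and weights.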