arcee-ai/sec-mistral-v2-Hercules
Text Generation · Model Size: 7B · Quant: FP8 · Context Length: 4k · Concurrency Cost: 1 · Architecture: Transformer · Published: Apr 11, 2024

arcee-ai/sec-mistral-v2-Hercules is a 7-billion-parameter language model created by arcee-ai, based on the Mistral architecture. It is a merge of arcee-ai/sec-mistral-7b-instruct-1.6-epoch and Locutusque/Hercules-4.0-Mistral-v0.2-7B, produced with the SLERP merge method, and is designed for general language tasks that leverage the combined strengths of its constituent models.
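SLERP (spherical linear interpolation) blends two models' weight tensors along the arc of the hypersphere between them rather than along a straight line, which tends to preserve the geometry of each parent's weights better than plain averaging. A minimal sketch of the idea on toy vectors (this is an illustration of the technique, not arcee-ai's actual merge pipeline, which is typically driven by a tool such as mergekit):

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherically interpolate between two flattened weight tensors.

    t = 0 returns v0, t = 1 returns v1; intermediate values follow
    the arc between the two directions on the unit hypersphere.
    """
    v0_n = v0 / (np.linalg.norm(v0) + eps)
    v1_n = v1 / (np.linalg.norm(v1) + eps)
    dot = np.clip(np.dot(v0_n, v1_n), -1.0, 1.0)
    theta = np.arccos(dot)
    if theta < eps:
        # Nearly parallel vectors: fall back to linear interpolation.
        return (1 - t) * v0 + t * v1
    sin_theta = np.sin(theta)
    return (np.sin((1 - t) * theta) / sin_theta) * v0 + \
           (np.sin(t * theta) / sin_theta) * v1

# Blend two toy "weight" vectors equally (t = 0.5).
a = np.array([1.0, 0.0])
b = np.array([0.0, 1.0])
merged = slerp(0.5, a, b)
```

In a real merge, this interpolation is applied tensor-by-tensor across the two parent checkpoints, often with a per-layer schedule for `t` rather than a single global value.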