Model Overview
arcee-ai/sec-mistral-v2-Hercules is a 7-billion-parameter language model developed by arcee-ai. It was produced by merging two pre-trained models, arcee-ai/sec-mistral-7b-instruct-1.6-epoch and Locutusque/Hercules-4.0-Mistral-v0.2-7B, using SLERP (Spherical Linear Interpolation), a technique commonly used to blend the capabilities of different models while preserving their learned representations.
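To illustrate the idea, here is a minimal sketch of SLERP applied to a pair of weight tensors, one from each source model. The flattening, the fallback to linear interpolation, and the interpolation factor t are illustrative assumptions, not details taken from the actual merge configuration.

```python
import torch

def slerp(t: float, a: torch.Tensor, b: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Spherically interpolate between two weight tensors.

    t=0 returns a, t=1 returns b. Falls back to linear
    interpolation when the tensors are nearly colinear.
    """
    a_flat, b_flat = a.flatten().float(), b.flatten().float()
    a_norm = a_flat / (a_flat.norm() + eps)
    b_norm = b_flat / (b_flat.norm() + eps)
    dot = torch.clamp(a_norm @ b_norm, -1.0, 1.0)
    omega = torch.acos(dot)           # angle between the two weight vectors
    if omega.abs() < eps:             # nearly colinear: LERP is numerically safer
        merged = (1 - t) * a_flat + t * b_flat
    else:
        so = torch.sin(omega)
        merged = (torch.sin((1 - t) * omega) / so) * a_flat \
               + (torch.sin(t * omega) / so) * b_flat
    return merged.reshape(a.shape).to(a.dtype)

# Hypothetical usage: blend one attention projection from each source model.
# w_sec = state_dict_sec["model.layers.0.self_attn.q_proj.weight"]
# w_herc = state_dict_hercules["model.layers.0.self_attn.q_proj.weight"]
# merged = slerp(0.5, w_sec, w_herc)
```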
Merge Details
The merge spans the full layer range (0 to 32) of both source models. Distinct interpolation weights were applied per parameter type, notably for the self-attention and MLP layers, to balance the characteristics of the two models. arcee-ai/sec-mistral-7b-instruct-1.6-epoch served as the base model, anchoring the merge in that iteration of the sec-mistral series. A sketch of how such a per-parameter-type scheme could be applied is shown below.
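The exact weighting values used in the merge are not reproduced in this card; the following sketch, reusing the slerp helper above, shows how a per-parameter-type schedule could be applied across two compatible state dicts. The factor values and the key-matching rule are purely illustrative assumptions.

```python
# Hypothetical per-parameter-type interpolation factors; the actual
# merge configuration's values are not stated in this card.
T_BY_TYPE = {
    "self_attn": 0.5,  # illustrative: even blend for attention weights
    "mlp": 0.5,        # illustrative: even blend for MLP weights
}
DEFAULT_T = 0.5        # fallback for embeddings, norms, lm_head, etc.

def merge_state_dicts(sd_base: dict, sd_other: dict) -> dict:
    """Merge two architecture-compatible state dicts tensor by tensor,
    selecting the interpolation factor from the parameter name."""
    merged = {}
    for name, w_base in sd_base.items():
        t = DEFAULT_T
        for key, factor in T_BY_TYPE.items():
            if key in name:   # e.g. "model.layers.3.self_attn.q_proj.weight"
                t = factor
                break
        merged[name] = slerp(t, w_base, sd_other[name])
    return merged
```

In practice, SLERP-based merge tooling typically allows the factor to vary not only by parameter type but also by layer depth; the flat schedule above is the simplest case.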
Key Characteristics
- Architecture: Based on the Mistral family of models.
- Parameter Count: 7 billion parameters.
- Merge Method: Utilizes the SLERP method for combining model weights.
- Constituent Models: Blends arcee-ai/sec-mistral-7b-instruct-1.6-epoch and Locutusque/Hercules-4.0-Mistral-v0.2-7B.