BoreanGale-70B: A NearSwap Merged Language Model
BoreanGale-70B is a 69-billion-parameter model developed by alchemonaut, created by merging two parent models with the custom NearSwap algorithm: 152334H/miqu-1-70b-sf (the primary) and Sao10K/WinterGoddess-1.4x-70B-L2 (the secondary).
Key Features and Merging Process
- NearSwap Algorithm: This custom method selectively interpolates weights between the primary (Miqu) and secondary (WinterGoddess) models. When a pair of weights is sufficiently similar, as defined by a threshold `t`, the secondary model's value is used (see the sketch after this list).
- Optimized Interpolation: This version of BoreanGale-70B uses a `t` value of 0.001, resulting in approximately 10% of the weights being fully switched to the WinterGoddess model. The README indicates that higher `t` values rapidly degrade model quality.
- Performance: The model demonstrates solid performance across various benchmarks, achieving an average score of 76.48 on the Open LLM Leaderboard. Notable scores include:
- AI2 Reasoning Challenge (25-Shot): 73.89
- HellaSwag (10-Shot): 89.37
- MMLU (5-Shot): 75.19
  - Winogrande (5-Shot): 84.53
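The model card describes NearSwap only at this level of detail. As a rough illustration, here is a minimal NumPy sketch of a similarity-gated interpolation matching that description; the function name, signature, and exact gating formula are assumptions rather than the author's published code.

```python
import numpy as np

def nearswap(t: float, v0: np.ndarray, v1: np.ndarray) -> np.ndarray:
    """Similarity-gated merge of two weight tensors (illustrative sketch).

    v0: weights from the primary model (Miqu)
    v1: weights from the secondary model (WinterGoddess)
    t:  sameness threshold controlling how readily v1 is adopted
    """
    # Per-element distance between the two models' weights.
    diff = np.abs(v0 - v1)
    # Blend factor: 1.0 (fully switch to v1) where the weights differ by
    # less than t, falling off as t / |v0 - v1| where they differ more.
    with np.errstate(divide="ignore", invalid="ignore"):
        lweight = np.clip(t / diff, 0.0, 1.0)
    # Where v0 == v1 exactly, the division is undefined, but the models
    # agree there, so any blend factor gives the same result; pin to 1.0.
    lweight = np.nan_to_num(lweight, nan=1.0, posinf=1.0)
    # Linear interpolation toward the secondary model.
    return (1.0 - lweight) * v0 + lweight * v1
```

Under this sketch, with `t` = 0.001 every element where the two models differ by less than 0.001 gets a blend factor of 1.0 and takes the WinterGoddess value outright, matching the roughly 10% of weights described as fully switched, while the remaining elements stay close to their Miqu values.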
Intended Use
- Noncommercial Research: Due to the uncertain origin of one of its base models (Miqu), BoreanGale-70B is explicitly licensed for noncommercial research use only.
- Quantized Versions Available: Various quantized versions (GGUF, exl2) are provided by the community for easier deployment and experimentation.
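For a quick local test of a community GGUF quant, a llama-cpp-python snippet like the one below should work; the model filename is a placeholder, since the card does not point to specific quant repositories.

```python
from llama_cpp import Llama

# Load a downloaded GGUF quant of BoreanGale-70B.
# The filename is a placeholder; substitute whichever quant you fetched.
llm = Llama(
    model_path="./BoreanGale-70B.Q4_K_M.gguf",
    n_ctx=4096,       # context window
    n_gpu_layers=-1,  # offload all layers to GPU when available
)

out = llm(
    "Summarize the NearSwap merging method in two sentences.",
    max_tokens=128,
)
print(out["choices"][0]["text"])
```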