tlphams/solar-10.7b-merged-v0.1
tlphams/solar-10.7b-merged-v0.1 is a 10.7-billion-parameter language model created by tlphams, built on upstage/SOLAR-10.7B-v1.0 as its base model. It is a TIES merge of three distinct SOLAR-10.7B variants, combining their respective strengths, and is intended for general language understanding and generation tasks.
tlphams/solar-10.7b-merged-v0.1 Overview
This model is a 10.7-billion-parameter language model developed by tlphams, created by merging three pre-trained SOLAR-10.7B variants. It uses the TIES merge method to combine the strengths of its constituent models, with upstage/SOLAR-10.7B-v1.0 serving as the base model.
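For reference, the merged checkpoint loads like any other Hugging Face causal language model. The snippet below is a minimal sketch using the transformers library; the prompt, dtype, and generation settings are illustrative assumptions, not settings published for this model.

```python
# Minimal sketch: load the merged model and run a short generation.
# Requires: pip install transformers accelerate torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "tlphams/solar-10.7b-merged-v0.1"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # keep the checkpoint's stored precision
    device_map="auto",    # place layers across available devices via accelerate
)

prompt = "Briefly explain what model merging is."  # illustrative prompt
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```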
Merge Details
The merge process incorporated three specific models:
- chihoonlee10/T3Q-ko-solar-dpo-v5.0
- krevas/SOLAR-10.7B
- hyeogi/SOLAR-10.7B-v1.6
The configuration for the merge involved specific density and weight parameters for each contributing model, aiming to optimize the combined performance. The merge was executed using mergekit, a tool designed for combining language models. The resulting model is intended for a broad range of natural language processing applications, benefiting from the diverse training and fine-tuning of its constituent parts.
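The specific density and weight values are not reproduced on this page. The sketch below shows how a TIES merge of these three models over the SOLAR base can be expressed through mergekit's Python API; every density/weight value, the output path, and the dtype are placeholder assumptions rather than the actual configuration used for this model.

```python
# Sketch of a TIES merge via mergekit's Python API.
# Requires: pip install mergekit
import yaml
import torch
from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

# NOTE: the density/weight values below are placeholders, not the
# parameters actually used for tlphams/solar-10.7b-merged-v0.1.
CONFIG_YAML = """
models:
  - model: chihoonlee10/T3Q-ko-solar-dpo-v5.0
    parameters:
      density: 0.5
      weight: 0.4
  - model: krevas/SOLAR-10.7B
    parameters:
      density: 0.5
      weight: 0.3
  - model: hyeogi/SOLAR-10.7B-v1.6
    parameters:
      density: 0.5
      weight: 0.3
merge_method: ties
base_model: upstage/SOLAR-10.7B-v1.0
parameters:
  normalize: true
dtype: float16
"""

merge_config = MergeConfiguration.model_validate(yaml.safe_load(CONFIG_YAML))
run_merge(
    merge_config,
    out_path="./solar-10.7b-merged-v0.1",  # placeholder output directory
    options=MergeOptions(
        cuda=torch.cuda.is_available(),  # use a GPU if one is present
        copy_tokenizer=True,             # carry the base model's tokenizer over
    ),
)
```

In TIES, `density` controls what fraction of each model's parameter deltas (relative to the base) survive magnitude pruning, and `weight` scales each model's contribution to the final combination after sign conflicts are resolved, which is why these two per-model parameters are the main knobs in this kind of configuration.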