jaspionjader/Kosmos-EVAA-Franken-v37-8B
jaspionjader/Kosmos-EVAA-Franken-v37-8B is an 8 billion parameter language model created by jaspionjader through a SLERP merge of jaspionjader/dp-6-8b and jaspionjader/f-9-8b. The merge combines the capabilities of its constituent models, with the aim of improving general language understanding and generation. It is intended for applications that benefit from the balanced behavior of a merged architecture.
Model Overview
jaspionjader/Kosmos-EVAA-Franken-v37-8B is an 8 billion parameter language model developed by jaspionjader. It was created using the SLERP merge method via mergekit, combining two distinct base models: jaspionjader/dp-6-8b and jaspionjader/f-9-8b.
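SLERP interpolates along the great-circle arc between two weight tensors rather than along a straight line, which preserves the norm of the blended weights better than plain averaging. A minimal NumPy sketch of the idea (an illustration only, not mergekit's actual implementation):

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two weight vectors.

    Interpolates along the arc between v0 and v1, falling back to
    linear interpolation when the vectors are nearly parallel.
    """
    v0_n = v0 / (np.linalg.norm(v0) + eps)
    v1_n = v1 / (np.linalg.norm(v1) + eps)
    dot = np.clip(np.dot(v0_n, v1_n), -1.0, 1.0)
    if abs(dot) > 0.9995:              # nearly colinear: lerp is stable here
        return (1 - t) * v0 + t * v1
    theta = np.arccos(dot)             # angle between the two directions
    s0 = np.sin((1 - t) * theta) / np.sin(theta)
    s1 = np.sin(t * theta) / np.sin(theta)
    return s0 * v0 + s1 * v1

# Toy example: blend two orthogonal "weight" vectors with t = 0.5
a = np.array([1.0, 0.0])
b = np.array([0.0, 1.0])
merged = slerp(0.5, a, b)
```

In a real merge this interpolation is applied tensor by tensor across both models, with `t` controlling how much each source model contributes.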
Merge Details
This model is a product of a specific configuration designed to blend the characteristics of its source models. The merge process involved:
- Source Models: `jaspionjader/dp-6-8b` and `jaspionjader/f-9-8b`.
- Layer Range: both models contributed layers 0 to 32.
- Merge Method: SLERP (Spherical Linear Interpolation).
- Base Model: `jaspionjader/dp-6-8b` served as the base for the merge.
- Parameters: specific `t` values were applied to the `self_attn` and `mlp` filters, with a general value of 0.5, indicating a weighted combination of the source models' parameters. The model uses `bfloat16` as its data type.
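Based on the details above, the mergekit configuration likely resembled the sketch below. The per-filter `t` schedules are not published here, so the bracketed numbers are placeholders; only the method, models, layer range, default `t` of 0.5, and `bfloat16` dtype are taken from the description.

```yaml
slices:
  - sources:
      - model: jaspionjader/dp-6-8b
        layer_range: [0, 32]
      - model: jaspionjader/f-9-8b
        layer_range: [0, 32]
merge_method: slerp
base_model: jaspionjader/dp-6-8b
parameters:
  t:
    - filter: self_attn
      value: [0.1, 0.3, 0.5, 0.7, 0.9]   # placeholder schedule
    - filter: mlp
      value: [0.9, 0.7, 0.5, 0.3, 0.1]   # placeholder schedule
    - value: 0.5                          # default t for all other tensors
dtype: bfloat16
```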
Potential Use Cases
Given its merged nature, Kosmos-EVAA-Franken-v37-8B is suitable for:
- General-purpose text generation and understanding.
- Applications benefiting from a blend of capabilities from its constituent models.
- Experimentation with merged model architectures to achieve specific performance profiles.
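Assuming the model is published on the Hugging Face Hub under this id, it should be loadable through the standard transformers API. A sketch (not run here; requires downloading the 8B checkpoint):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "jaspionjader/Kosmos-EVAA-Franken-v37-8B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the merge's bfloat16 data type
    device_map="auto",
)

inputs = tokenizer(
    "The key idea behind model merging is", return_tensors="pt"
).to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```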