pabloce/Dolphin-2.8-slerp
Text generation · Model size: 7B · Quant: FP8 · Context length: 4k · Published: Mar 5, 2024 · License: apache-2.0 · Architecture: Transformer · Open weights · Concurrency cost: 1
pabloce/Dolphin-2.8-slerp is a 7-billion-parameter language model created by pabloce via a SLERP (spherical linear interpolation) merge of yam-peleg/Experiment26-7B and cognitivecomputations/dolphin-2.8-experiment26-7b. By interpolating the weights of its two constituent models rather than averaging them linearly, the SLERP merge aims to blend their characteristics while preserving the geometry of each model's weight space. The resulting model targets general language understanding and generation tasks, inheriting capabilities from both of its merged predecessors.
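The model card does not publish the exact merge recipe, but the core of any SLERP merge is the spherical interpolation of corresponding weight tensors. The minimal sketch below (the function name `slerp` and the NumPy-based implementation are illustrative assumptions, not the author's actual tooling) shows how a single pair of flattened weight vectors would be interpolated:

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two flattened weight tensors.

    t=0 returns v0, t=1 returns v1; intermediate t values follow the
    great-circle arc between the two (normalized) directions.
    """
    v0n = v0 / (np.linalg.norm(v0) + eps)
    v1n = v1 / (np.linalg.norm(v1) + eps)
    # Angle between the two weight directions.
    dot = np.clip(np.dot(v0n, v1n), -1.0, 1.0)
    omega = np.arccos(dot)
    if omega < eps:
        # Nearly parallel vectors: fall back to plain linear interpolation.
        return (1 - t) * v0 + t * v1
    so = np.sin(omega)
    return (np.sin((1 - t) * omega) / so) * v0 + (np.sin(t * omega) / so) * v1

# Toy example: the halfway point between two orthogonal unit "weights"
# stays on the unit sphere, unlike a linear average (which would shrink it).
a = np.array([1.0, 0.0])
b = np.array([0.0, 1.0])
mid = slerp(0.5, a, b)
```

In an actual merge, this interpolation is applied tensor-by-tensor across both checkpoints (tools such as mergekit also allow per-layer interpolation factors), which is what distinguishes a SLERP merge from a simple weight average.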