jeiku/Elly_7B
Text generation · Model size: 7B · Quant: FP8 · Context length: 4k · Concurrency cost: 1 · Published: Mar 19, 2024 · License: other · Architecture: Transformer
Elly_7B is a 7-billion-parameter language model created by jeiku, built on the SanjiWatsuki/Sonya-7B base model using the DARE TIES merge method. The merge incorporates MaziyarPanahi/samantha-mistral-7b-Mistral-7B-Instruct-v0.1 and cognitivecomputations/dolphin-2.6-mistral-7b, aiming to combine the strengths of both instruction-tuned models. The result is intended for general-purpose text generation across a range of tasks.
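For readers unfamiliar with DARE TIES merging, models like this are commonly produced with the mergekit tool, which takes a YAML config naming the base model, the models to merge, and per-model `density` and `weight` parameters. The sketch below is a hypothetical illustration, not jeiku's actual recipe: the parameter values and dtype are assumptions chosen for the example.

```yaml
# Hypothetical mergekit config illustrating a DARE TIES merge
# of the models named above; density/weight values are assumed,
# not taken from the actual Elly_7B recipe.
merge_method: dare_ties
base_model: SanjiWatsuki/Sonya-7B
models:
  - model: MaziyarPanahi/samantha-mistral-7b-Mistral-7B-Instruct-v0.1
    parameters:
      density: 0.5   # fraction of delta weights kept after random drop
      weight: 0.5    # contribution of this model to the merge
  - model: cognitivecomputations/dolphin-2.6-mistral-7b
    parameters:
      density: 0.5
      weight: 0.5
dtype: bfloat16
```

DARE TIES randomly drops a fraction of each model's weight deltas (controlled by `density`), rescales the survivors, and then resolves sign conflicts TIES-style before summing, which tends to reduce interference between the merged models.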