pandego/my-first-blend
Text generation · Concurrency cost: 1 · Model size: 7B · Quantization: FP8 · Context length: 4k · Published: Apr 1, 2024 · License: apache-2.0 · Architecture: Transformer · Open weights

pandego/my-first-blend is a 7-billion-parameter language model created by pandego, merged using the task arithmetic method with mistralai/Mistral-7B-Instruct-v0.2 as its base. It combines SanjiWatsuki/Kunoichi-DPO-v2-7B and paulml/NeuralOmniWestBeaglake-7B to blend their capabilities. The model shows balanced performance across benchmarks, making it suitable for general-purpose instruction-following tasks.
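Task arithmetic works by extracting each fine-tuned model's "task vector" (its weights minus the base model's weights) and adding those vectors back onto the base. The sketch below illustrates the idea on plain NumPy arrays; the merge weights and array values are illustrative assumptions, not the actual recipe or parameters used for this model.

```python
import numpy as np

def task_arithmetic_merge(base, finetuned, weights=None):
    """Merge fine-tuned weight arrays into `base` via task arithmetic.

    Each fine-tune contributes a task vector (finetuned - base), which is
    scaled by its merge weight and added to the base weights.
    """
    if weights is None:
        weights = [1.0] * len(finetuned)
    merged = base.copy()
    for w, ft in zip(weights, finetuned):
        merged += w * (ft - base)  # scaled task vector
    return merged

# Toy stand-ins for the base and the two fine-tunes (hypothetical values):
base = np.array([1.0, 2.0, 3.0])
model_a = np.array([1.5, 2.0, 3.5])
model_b = np.array([1.0, 2.5, 3.0])

merged = task_arithmetic_merge(base, [model_a, model_b], weights=[0.5, 0.5])
print(merged)  # each parameter is base plus the weighted deltas
```

In practice a merge like this is performed per tensor over full model checkpoints (tools such as mergekit automate it), but the arithmetic per parameter is exactly the element-wise operation shown here.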
