Inv/Konstanta-7B
Text generation · Model size: 7B · Quant: FP8 · Context length: 4k · Published: Feb 3, 2024 · License: apache-2.0 · Architecture: Transformer · Concurrency cost: 1 · Open weights

Inv/Konstanta-7B is a 7-billion-parameter language model from Inv, produced by merging SanjiWatsuki/Kunoichi-DPO-v2-7B, maywell/PiVoT-0.1-Evil-a, and mlabonne/NeuralOmniBeagle-7B-v2 with the dare_ties method. It is a test merge intended to improve performance by combining models with strong individual results. With a 4096-token context length, it achieves an average score of 73.54 on the Open LLM Leaderboard, demonstrating capabilities across reasoning, common-sense, and question-answering tasks.
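A dare_ties merge like this is typically produced with mergekit from a YAML recipe. The sketch below is a hypothetical illustration, not the author's actual recipe: the base model, density, and weight values are assumptions chosen for the example.

```yaml
# Hypothetical mergekit recipe for a dare_ties merge of the three source models.
# base_model, density, and weight values are illustrative assumptions.
models:
  - model: SanjiWatsuki/Kunoichi-DPO-v2-7B
    parameters:
      density: 0.5   # fraction of delta weights kept (assumed value)
      weight: 0.4    # contribution to the merged model (assumed value)
  - model: maywell/PiVoT-0.1-Evil-a
    parameters:
      density: 0.5
      weight: 0.3
  - model: mlabonne/NeuralOmniBeagle-7B-v2
    parameters:
      density: 0.5
      weight: 0.3
merge_method: dare_ties
base_model: mistralai/Mistral-7B-v0.1  # assumed common base for these 7B models
dtype: bfloat16
```

In dare_ties, each source model's delta from the base is randomly sparsified (controlled by `density`) and rescaled before the sign-consensus TIES combination, which is why the same density can be shared across models while `weight` sets their relative influence.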
