saishf/Fett-Eris-Mix-7B
Text generation · Concurrency cost: 1 · Model size: 7B · Quant: FP8 · Context length: 4K · Published: Mar 6, 2024 · License: cc-by-nc-4.0 · Architecture: Transformer · Open Weights

saishf/Fett-Eris-Mix-7B is a 7-billion-parameter language model created by merging Epiculous/Fett-uccine-7B, eren23/ogno-monarch-jaskier-merge-7b-OH-PREF-DPO-v2, and ChaoticNeutrals/Eris_7B with the DARE TIES method, using OpenPipe/mistral-ft-optimized-1227 as the base model. It is designed for coherent roleplay and is reported to remain stable at extended context lengths of 8K+ tokens (note that this deployment serves a 4K context window). The model scores a 71.66% average on the Open LLM Leaderboard, making it well suited to nuanced conversational applications.
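A merge like this is typically produced with mergekit. The sketch below shows what a DARE TIES configuration for these three models over the stated base could look like; the `density` and `weight` values are illustrative assumptions, not the recipe actually used for this model.

```yaml
# Hypothetical mergekit config (values are assumptions, not the published recipe)
models:
  - model: Epiculous/Fett-uccine-7B
    parameters:
      density: 0.5   # fraction of delta weights kept (assumed)
      weight: 0.33   # contribution to the merge (assumed)
  - model: eren23/ogno-monarch-jaskier-merge-7b-OH-PREF-DPO-v2
    parameters:
      density: 0.5
      weight: 0.33
  - model: ChaoticNeutrals/Eris_7B
    parameters:
      density: 0.5
      weight: 0.33
merge_method: dare_ties
base_model: OpenPipe/mistral-ft-optimized-1227
dtype: bfloat16
```

DARE TIES randomly drops a fraction of each model's delta from the base (controlled by `density`), rescales the remainder, and resolves sign conflicts TIES-style before summing, which tends to preserve each donor model's behavior better than naive weight averaging.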
