ChaoticNeutrals/Eris_Remix_7B
Text generation · Concurrency cost: 1 · Model size: 7B · Quant: FP8 · Context length: 4K · Published: Mar 5, 2024 · License: other · Architecture: Transformer

ChaoticNeutrals/Eris_Remix_7B is a 7-billion-parameter language model created by ChaoticNeutrals by merging the 'SpecialEdition' and 'Remix' base models with SLERP (spherical linear interpolation). The merge applies distinct interpolation weights to the self-attention and MLP layers, and the model is intended for general text generation tasks. Weights are provided in bfloat16 precision, with community-contributed GGUF and EXL2 quantizations available.
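A SLERP merge with per-layer-type weighting like the one described is typically expressed as a mergekit configuration. The sketch below is illustrative only: the repository paths, layer ranges, and interpolation values are assumptions, not the actual recipe used for this model.

```yaml
# Hypothetical mergekit SLERP config; model paths and t-values are placeholders,
# not the exact settings used to produce Eris_Remix_7B.
slices:
  - sources:
      - model: ChaoticNeutrals/SpecialEdition   # assumed repo path
        layer_range: [0, 32]
      - model: ChaoticNeutrals/Remix            # assumed repo path
        layer_range: [0, 32]
merge_method: slerp
base_model: ChaoticNeutrals/SpecialEdition      # assumed base
parameters:
  t:
    # Separate interpolation curves for attention and MLP layers,
    # mirroring the "specific parameter weighting" noted above.
    - filter: self_attn
      value: [0, 0.5, 0.3, 0.7, 1]
    - filter: mlp
      value: [1, 0.5, 0.7, 0.3, 0]
    - value: 0.5                                # default for all other tensors
dtype: bfloat16
```

With mergekit installed, such a config would be run as `mergekit-yaml config.yml ./output-model`; the `t` curves are interpolated across the layer stack, so attention and MLP weights can favor different parents at different depths.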
