adamo1139/Mistral-7B-AEZAKMI-v2
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Dec 24, 2023 · License: apache-2.0 · Architecture: Transformer · Open Weights · Cold

adamo1139/Mistral-7B-AEZAKMI-v2 is a 7-billion-parameter language model fine-tuned by adamo1139 from the Mistral 7B v0.1 base model. It is optimized for conversational chat and aims to reduce the RLHF-style phrasing and refusals often found in models such as Airoboros. The model is intended to be an uncensored, cozy chatbot that excels at free-form dialogue rather than complex reasoning or mathematical tasks, and it supports a 4096-token context length.
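Below is a minimal sketch of loading the model with Hugging Face transformers for free-form chat. The ChatML-style prompt and the sampling settings are assumptions for illustration; check the model card for the exact prompt template and recommended generation parameters.

```python
# Minimal sketch: chatting with adamo1139/Mistral-7B-AEZAKMI-v2 via transformers.
# The prompt format below is an assumption -- verify it against the model card.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "adamo1139/Mistral-7B-AEZAKMI-v2"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # a 7B model in fp16 fits on a single 24 GB GPU
    device_map="auto",
)

# Hypothetical ChatML-style prompt; adjust to whatever template the fine-tune expects.
prompt = (
    "<|im_start|>system\nA chat.<|im_end|>\n"
    "<|im_start|>user\nTell me something cozy about winter evenings.<|im_end|>\n"
    "<|im_start|>assistant\n"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(
    **inputs,
    max_new_tokens=256,  # keep prompt + response well inside the 4096-token context
    do_sample=True,
    temperature=0.7,
)

# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(output[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```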
