Azazelle/Maylin-7b
Text generation · Concurrency cost: 1 · Model size: 7B · Quant: FP8 · Context length: 4k · Published: Jan 4, 2024 · License: cc-by-4.0 · Architecture: Transformer · Open weights

Azazelle/Maylin-7b is a 7-billion-parameter language model based on the Mistral-7B-v0.1 architecture, created through a DARE merge. The merge is designed to improve coherence and reduce undesirable biases present in the Argetsu model, aiming for more balanced, focused output on general language generation tasks.
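For context, DARE (Drop And REscale) merging sparsifies each fine-tuned model's parameter delta (fine-tuned weights minus base weights) by randomly dropping a fraction of the delta's entries and rescaling the survivors, then adds the result back onto the base. The following is a minimal NumPy sketch of that idea only; the function names, drop rate, and simple summation are illustrative assumptions, not the actual recipe used to build Maylin-7b:

```python
import numpy as np

def dare_sparsify(base, finetuned, drop_rate, rng):
    """Drop And REscale: zero out a random fraction of the parameter
    delta, then rescale survivors by 1/(1 - drop_rate) so the delta's
    expected value is preserved."""
    delta = finetuned - base
    mask = rng.random(delta.shape) >= drop_rate  # keep with prob 1 - p
    return delta * mask / (1.0 - drop_rate)

def dare_merge(base, finetuned_models, drop_rate=0.9, seed=0):
    """Merge several fine-tunes onto a shared base by summing their
    DARE-sparsified deltas (a simplified merge rule for illustration)."""
    rng = np.random.default_rng(seed)
    merged = base.copy()
    for ft in finetuned_models:
        merged += dare_sparsify(base, ft, drop_rate, rng)
    return merged
```

Because the rescaling keeps the delta's expectation unchanged, a high drop rate can remove most of each fine-tune's parameters while retaining much of its behavior, which is what makes DARE useful for combining models without their deltas interfering.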
