Heralax/MistralMakise-Merged-13b
Text generation · Model size: 13B · Quant: FP8 · Context length: 4K · License: apache-2.0 · Architecture: Transformer · Concurrency cost: 1 · Open weights

Heralax/MistralMakise-Merged-13b is a 13-billion-parameter language model based on the Mistral architecture. It was trained with the same dataset and settings as MythoMakise, using the ReMM Mistral model as its foundation, and is intended for general-purpose text generation.
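A minimal usage sketch with the Hugging Face `transformers` library is shown below. The prompt format is an assumption (the card does not document a chat template), the `build_prompt` and `generate` helpers are hypothetical names, and running the model locally requires substantial GPU memory (roughly 26 GB in fp16 for a 13B model):

```python
MODEL_ID = "Heralax/MistralMakise-Merged-13b"

def build_prompt(instruction: str) -> str:
    # Plain instruction-style prompt. This format is an assumption;
    # adjust it if the model expects a different template.
    return f"### Instruction:\n{instruction}\n\n### Response:\n"

def generate(instruction: str, max_new_tokens: int = 256) -> str:
    # Imports are local so the prompt helper above works without
    # transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")
    inputs = tokenizer(build_prompt(instruction), return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens, do_sample=True)
    return tokenizer.decode(output[0], skip_special_tokens=True)

if __name__ == "__main__":
    print(generate("Summarize the Mistral architecture in one sentence."))
```

Sampling parameters (`do_sample`, `max_new_tokens`) are illustrative defaults, not recommendations from the model author.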
