Azazelle/Moko-SAMPLE
Text Generation | Concurrency Cost: 1 | Model Size: 7B | Quant: FP8 | Ctx Length: 4k | Published: Mar 22, 2024 | License: cc-by-4.0 | Architecture: Transformer | Open Weights | Cold

Azazelle/Moko-SAMPLE is a 7-billion-parameter language model created by Azazelle, merged using the sample_ties method with mistralai/Mistral-7B-v0.1 as its base. The merge integrates WizardLM/WizardMath-7B-V1.1, akjindal53244/Mistral-7B-v0.1-Open-Platypus, and Open-Orca/Mistral-7B-OpenOrca. It is designed to combine the strengths of its constituent models, particularly mathematical reasoning and instruction following, and supports a 4096-token context length.
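A merge like this is typically produced with mergekit, whose TIES method combines the task vectors of the component models on top of the shared base. The sketch below is a hypothetical mergekit config, not the author's actual recipe: the model list and base follow the card above, but the density and weight values, normalization flag, and dtype are illustrative assumptions.

```yaml
# Hypothetical mergekit config (ties method) — values are illustrative,
# not the actual parameters used to build Moko-SAMPLE.
models:
  - model: mistralai/Mistral-7B-v0.1
    # Base model: contributes no task vector of its own.
  - model: WizardLM/WizardMath-7B-V1.1
    parameters:
      density: 0.5   # assumed: fraction of task-vector params kept
      weight: 0.3    # assumed: relative contribution to the merge
  - model: akjindal53244/Mistral-7B-v0.1-Open-Platypus
    parameters:
      density: 0.5
      weight: 0.3
  - model: Open-Orca/Mistral-7B-OpenOrca
    parameters:
      density: 0.5
      weight: 0.3
merge_method: ties
base_model: mistralai/Mistral-7B-v0.1
parameters:
  normalize: true    # assumed: rescale summed weights
dtype: float16       # assumed working precision for the merge
```

With mergekit installed, such a config would be applied with `mergekit-yaml config.yml ./output-model`; the card's "sample_ties" label suggests a TIES-style merge, though the exact variant and settings are not published.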
