Azazelle/Mocha-Sample-7b-ex
Text generation · Model size: 7B · Quantization: FP8 · Context length: 4k · Published: Mar 23, 2024 · License: cc-by-4.0 · Architecture: Transformer · Open weights

Azazelle/Mocha-Sample-7b-ex is a 7 billion parameter language model created by Azazelle, built on the Mistral-7B-v0.1 base architecture. It is a merge of WizardMath-7B-V1.1, Mistral-7B-v0.1-Open-Platypus, and Mistral-7B-OpenOrca, produced with the sample_ties merge method. The merge aims to combine the strengths of its constituent models, particularly mathematical reasoning and general instruction following. The model supports a context length of 4096 tokens.
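A merge like this is typically described with a mergekit-style configuration. The sketch below is an assumption about what such a config could look like, not the author's actual recipe: the model repository paths, weights, and densities are illustrative placeholders, and "sample_ties" is written as given in the card (mergekit's standard method of this family is `ties`).

```yaml
# Hypothetical mergekit config sketch for a ties-style merge of the
# three constituent models named in the card. Weights and densities
# are illustrative, not the values actually used.
models:
  - model: WizardLM/WizardMath-7B-V1.1
    parameters:
      weight: 0.33
      density: 0.5
  - model: lgaalves/mistral-7b-v0.1_open-platypus   # assumed repo path
    parameters:
      weight: 0.33
      density: 0.5
  - model: Open-Orca/Mistral-7B-OpenOrca
    parameters:
      weight: 0.34
      density: 0.5
merge_method: ties            # card lists "sample_ties"
base_model: mistralai/Mistral-7B-v0.1
dtype: bfloat16
```

In a ties merge, each fine-tune's delta from the base model is sparsified (keeping the top fraction set by `density`), sign conflicts between models are resolved by majority, and the surviving deltas are averaged back onto the base weights.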
