h2m/mhm-7b-v1.3
Text Generation · Open Weights
Model Size: 7B · Quant: FP8 · Context Length: 4k · Concurrency Cost: 1 · Published: Jan 14, 2024 · License: apache-2.0 · Architecture: Transformer
h2m/mhm-7b-v1.3 is a 7-billion-parameter language model created by h2m using mergekit. It is a merge of seven pre-trained language models, primarily based on the Mistral architecture, produced through a three-stage merging process. The model is experimental: it combines top-performing models from the OpenLLM leaderboard to explore the capabilities of merged models.
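The sketch below shows one way to load the model for text generation with the Hugging Face transformers library. It assumes the weights are published on the Hugging Face Hub under the same "h2m/mhm-7b-v1.3" identifier and that a CUDA-capable GPU is available; the dtype and generation settings are illustrative, not part of the model card.

```python
# Minimal usage sketch: load the merged model and generate text.
# Assumes the repo id "h2m/mhm-7b-v1.3" resolves on the Hugging Face Hub.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "h2m/mhm-7b-v1.3"  # assumed Hub repo id, matching the model name above

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # the hosted quant is FP8; fp16 is shown here for broad GPU support
    device_map="auto",          # place layers automatically across available devices
)

prompt = "Explain in one paragraph what model merging with mergekit does."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Keep prompts within the 4k-token context window noted above; longer inputs will be truncated or rejected by the tokenizer/model configuration.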