grimjim/Mistral-Starling-merge-trial1-7B
Text generation | Concurrency cost: 1 | Model size: 7B | Quant: FP8 | Context length: 4K | Published: Mar 28, 2024 | License: apache-2.0 | Architecture: Transformer

grimjim/Mistral-Starling-merge-trial1-7B is a 7-billion-parameter language model created by grimjim by merging Nexusflow/Starling-LM-7B-beta and grimjim/Mistral-7B-Instruct-demi-merge-v0.2-7B. Built with the SLERP merge method, the model aims to combine strong reasoning capabilities with a 32K context length and is designed for tasks requiring robust logical processing over extended inputs.
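
For context, a SLERP (spherical linear interpolation) merge blends each pair of corresponding weight tensors along a great-circle arc rather than a straight line, which tends to preserve the magnitude structure of the parameters. The sketch below is a minimal, illustrative PyTorch implementation, not the exact recipe used for this merge (such merges are commonly produced with dedicated tooling like mergekit); the function names, the single interpolation ratio `t`, and the per-key merge loop are assumptions for illustration only.

```python
import torch

def slerp(t: float, v0: torch.Tensor, v1: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Spherical linear interpolation between two weight tensors.

    t=0 returns v0, t=1 returns v1; intermediate values follow the
    great-circle arc between the flattened parameter vectors.
    Illustrative sketch only, not the exact merge recipe used here.
    """
    v0_flat, v1_flat = v0.flatten().float(), v1.flatten().float()
    # Angle between the two parameter vectors.
    cos_theta = torch.dot(v0_flat, v1_flat) / (v0_flat.norm() * v1_flat.norm() + eps)
    theta = torch.acos(cos_theta.clamp(-1.0, 1.0))
    sin_theta = torch.sin(theta)
    if sin_theta.abs() < eps:
        # Nearly parallel tensors: fall back to plain linear interpolation.
        merged = (1.0 - t) * v0_flat + t * v1_flat
    else:
        merged = (
            torch.sin((1.0 - t) * theta) / sin_theta * v0_flat
            + torch.sin(t * theta) / sin_theta * v1_flat
        )
    return merged.reshape(v0.shape).to(v0.dtype)

def merge_state_dicts(sd_a: dict, sd_b: dict, t: float = 0.5) -> dict:
    # Merge every parameter shared by both models with a single ratio.
    # (Real merge configs often vary t per layer or per tensor type.)
    return {k: slerp(t, sd_a[k], sd_b[k]) for k in sd_a if k in sd_b}
```

In practice, merge tooling typically allows the interpolation ratio to vary across layers (for example, weighting attention and MLP blocks differently), rather than using one global value as in this simplified sketch.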
