winglian/mistral-11b-128k
Text generation · Concurrency cost: 1 · Model size: 10.7B · Quant: FP8 · Context length: 4k · Published: Nov 12, 2023 · License: apache-2.0 · Architecture: Transformer · Open weights

winglian/mistral-11b-128k is a 10.7B-parameter pretrained generative text model created by winglian, built as a mergekit merge of Nous Research's Yarn-Mistral-7b-128k. The underlying Yarn-Mistral base extends Mistral 7B's context window to 128k tokens via YaRN, and the merge increases the parameter count for potentially stronger generation quality. Note, however, that despite the "128k" in the name, this deployment lists a 4,096-token context length, so longer inputs are not supported here.
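The exact merge recipe for this model is not given on this page, but mergekit "passthrough" merges of this kind typically reach ~10.7B parameters by duplicating a slice of a 7B model's layers. The config below is an illustrative sketch only: the layer ranges and dtype are assumptions, not the published recipe.

```yaml
# Hypothetical mergekit passthrough config for a ~10.7B self-merge of a
# 7B base model. The layer ranges here are illustrative assumptions;
# the actual ranges used for mistral-11b-128k are not stated on this page.
slices:
  - sources:
      - model: NousResearch/Yarn-Mistral-7b-128k
        layer_range: [0, 24]
  - sources:
      - model: NousResearch/Yarn-Mistral-7b-128k
        layer_range: [8, 32]
merge_method: passthrough   # concatenates the slices instead of averaging weights
dtype: bfloat16
```

Overlapping the two layer ranges (layers 8–23 appear twice) is what inflates the parameter count from 7B to roughly 11B without any further training.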
