Trelis/Mistral-7B-Instruct-v0.1-Summarize-16k
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Dec 8, 2023 · License: apache-2.0 · Architecture: Transformer

Trelis/Mistral-7B-Instruct-v0.1-Summarize-16k is a 7-billion-parameter, Mistral-based language model fine-tuned for summarization. It extends the context window of the original Mistral 7B Instruct model to 16,000 tokens through unsupervised fine-tuning. The model is optimized for processing and summarizing long texts, making it suitable for applications that need concise overviews of extensive content.
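A minimal usage sketch with the Hugging Face `transformers` library is below. The summarization instruction wording and generation parameters are assumptions, not taken from the model card; the base model's `[INST] ... [/INST]` turn format is used since this is a Mistral-Instruct fine-tune.

```python
def build_summarize_prompt(text: str) -> str:
    """Wrap input text in the Mistral-Instruct [INST] ... [/INST] turn format.

    The exact summarization instruction here is an assumption; the model
    card does not specify a required prompt template.
    """
    return f"[INST] Summarize the following text:\n\n{text} [/INST]"


def summarize(text: str, max_new_tokens: int = 256) -> str:
    """Sketch of generation with `transformers`; requires the package and
    enough GPU/CPU memory for a 7B model, so it is not run at import time."""
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "Trelis/Mistral-7B-Instruct-v0.1-Summarize-16k"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

    inputs = tokenizer(build_summarize_prompt(text), return_tensors="pt").to(model.device)
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, skipping the echoed prompt.
    new_tokens = output_ids[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)
```

Because the model was fine-tuned to 16k tokens, inputs up to roughly that length can be summarized in a single call without chunking.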
