amazon/MistralLite
TEXT GENERATION

Concurrency Cost: 1
Model Size: 7B
Quant: FP8
Ctx Length: 4k
Published: Oct 16, 2023
License: apache-2.0
Architecture: Transformer

amazon/MistralLite is a 7-billion-parameter language model fine-tuned from Mistral-7B-v0.1 by AWS contributors. It significantly enhances long-context processing, supporting up to 32K tokens by adapting the rotary position embedding and using a larger sliding-window attention. The model excels at long-context retrieval and answering tasks, making it well suited to summarization and question answering over extensive documents.
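For summarization or QA over documents longer than the context window, inputs must first be split into chunks that fit. A minimal sketch of such a splitter, assuming a rough characters-per-token heuristic (the helper name and ratio are illustrative; a real tokenizer gives exact counts):

```python
def chunk_document(text, max_tokens=32000, chars_per_token=4):
    """Split a long document into chunks that should fit the context window.

    Uses a crude chars-per-token estimate; swap in the model's actual
    tokenizer for exact budgeting.
    """
    max_chars = max_tokens * chars_per_token
    chunks = []
    start = 0
    while start < len(text):
        end = min(start + max_chars, len(text))
        # Prefer to break at a paragraph boundary when one falls in range.
        cut = text.rfind("\n\n", start, end)
        if cut <= start or end == len(text):
            cut = end
        chunks.append(text[start:cut])
        start = cut
    return chunks
```

Each chunk can then be summarized or queried independently, with the per-chunk answers combined in a final pass.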


Popular Sampler Settings

The three most popular parameter combinations used by Featherless users for this model draw on the following sampler settings:

- temperature
- top_p
- top_k
- frequency_penalty
- presence_penalty
- repetition_penalty
- min_p
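These sampler settings map directly onto fields of an OpenAI-style completion request. A sketch of assembling such a payload, assuming illustrative values (the numbers below are placeholders, not one of the published Featherless presets):

```python
import json

# Illustrative sampler values -- placeholders, not an official preset.
sampler_settings = {
    "temperature": 0.7,
    "top_p": 0.9,
    "top_k": 40,
    "frequency_penalty": 0.0,
    "presence_penalty": 0.0,
    "repetition_penalty": 1.1,
    "min_p": 0.05,
}

# Merge the sampler settings into a completion request body.
payload = {
    "model": "amazon/MistralLite",
    "prompt": "Summarize the following document: ...",
    "max_tokens": 512,
    **sampler_settings,
}

print(json.dumps(payload, indent=2))
```

The merged payload would then be POSTed to an OpenAI-compatible completions endpoint with the appropriate API key.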