DeepMount00/Mistral-RAG
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Context Length: 8k · Published: Apr 20, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights
DeepMount00/Mistral-RAG is a 7 billion parameter language model, fine-tuned from Mistral-Ita-7b by DeepMount00 and engineered specifically for question-answering tasks. It features a dual-response capability, offering both a generative mode for complex, synthesized explanations and an extractive mode for direct, concise answers. This makes the model suitable for a range of informational needs, from educational and advisory services to factual research and professional contexts.
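In a retrieval-augmented (RAG) setup, the model's answers are driven by how retrieved passages and the user's question are packed into a single prompt. A minimal sketch of that assembly step, assuming a generic RAG prompt template (the template below is illustrative, not the model's documented format):

```python
# Illustrative RAG prompt assembly for a Q&A model such as
# DeepMount00/Mistral-RAG. The template is an assumption, not the
# model's documented prompt format.

def build_rag_prompt(context_passages, question):
    """Join retrieved passages and a question into one prompt string."""
    context = "\n\n".join(context_passages)
    return (
        "Answer the question using only the context below.\n\n"
        f"Context:\n{context}\n\n"
        f"Question: {question}\nAnswer:"
    )

prompt = build_rag_prompt(
    ["Mistral-Ita-7b is an Italian-language fine-tune of Mistral 7B."],
    "What base model was Mistral-Ita-7b derived from?",
)
print(prompt)
```

The resulting string is what would be sent to the model as a single completion prompt; swapping the instruction line is one plausible way to steer between synthesized and extractive answers.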
Popular Sampler Settings

The top three parameter combinations used by Featherless users for this model cover the following sampler settings (values not captured here): temperature, top_p, top_k, frequency_penalty, presence_penalty, repetition_penalty, min_p.
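These sampler parameters map onto the fields of a completion request. A minimal sketch of building such a request with per-call overrides, using hypothetical placeholder values (not the popular settings reported by Featherless, which are not reproduced above):

```python
# Hypothetical sampler configuration; the numbers below are illustrative
# placeholders, not the Featherless users' popular settings.
DEFAULT_SAMPLER = {
    "temperature": 0.7,
    "top_p": 0.9,
    "top_k": 40,
    "frequency_penalty": 0.0,
    "presence_penalty": 0.0,
    "repetition_penalty": 1.1,
    "min_p": 0.05,
}

def make_request(prompt, **overrides):
    """Build a completion-request payload, overriding chosen samplers."""
    unknown = set(overrides) - set(DEFAULT_SAMPLER)
    if unknown:
        raise ValueError(f"unknown sampler parameters: {sorted(unknown)}")
    return {
        "model": "DeepMount00/Mistral-RAG",
        "prompt": prompt,
        **DEFAULT_SAMPLER,
        **overrides,
    }

req = make_request("What is RAG?", temperature=0.2)
print(req["temperature"], req["top_k"])  # → 0.2 40
```

Lower temperatures suit the extractive, direct-answer mode, while higher values give the generative mode more latitude; the dict-merge pattern keeps one set of defaults while letting each call override only what it needs.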