BeastyZ/e5-R-mistral-7b
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Jun 28, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights · Cold
BeastyZ/e5-R-mistral-7b is a 7-billion-parameter causal language model, fine-tuned from mistralai/Mistral-7B-v0.1 and optimized to serve as a retriever. Fine-tuned on the E5-R dataset, it excels at information retrieval, efficiently identifying and extracting relevant information, and is designed for applications that require robust document or passage retrieval.
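As a rough illustration of how a retriever built on an embedding model is typically used downstream, the sketch below ranks passages against a query by cosine similarity over their embedding vectors. This is a generic retrieval-scoring sketch, not this model's exact pipeline: the toy 4-dimensional vectors stand in for embeddings that the model would produce, and the function name `cosine_sim` is illustrative.

```python
import numpy as np

def cosine_sim(query_vec, passage_mat):
    # Cosine similarity between one query vector and a matrix of
    # passage vectors (one passage embedding per row).
    q = query_vec / np.linalg.norm(query_vec)
    p = passage_mat / np.linalg.norm(passage_mat, axis=1, keepdims=True)
    return p @ q

# Toy 4-dim embeddings standing in for the model's output vectors.
query = np.array([1.0, 0.0, 1.0, 0.0])
passages = np.array([
    [1.0, 0.0, 1.0, 0.0],   # same direction as the query -> score 1.0
    [0.0, 1.0, 0.0, 1.0],   # orthogonal to the query     -> score 0.0
    [1.0, 1.0, 1.0, 1.0],   # partial overlap
])

scores = cosine_sim(query, passages)
ranking = np.argsort(-scores)   # indices of passages, best match first
print(ranking.tolist())         # [0, 2, 1]
```

In a real deployment, the query and passage embeddings would come from the model itself, and the top-ranked passages would be passed on to whatever component consumes the retrieval results.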