uukuguy/speechless-mistral-moloras-7b
The uukuguy/speechless-mistral-moloras-7b is a 7-billion-parameter language model based on the Mistral architecture, using a Mixture-of-multi-LoRAs (moloras) approach. The model integrates six distinct Mistral-based LoRA modules and assembles them automatically with a gradient-free router, adapting to new tasks in only a few inference steps. It achieves an average score of 60.93 on the Open LLM Leaderboard benchmarks.
Model Overview
The uukuguy/speechless-mistral-moloras-7b is a 7-billion-parameter language model built upon the Mistral architecture. Its core innovation lies in its Mixture-of-multi-LoRAs (moloras) design, which combines six distinct Mistral-based LoRA modules. A gradient-free router assembles the LoRA modules automatically, tuning their mixing coefficients for a given task with only a few inference steps.
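To make the LoRA building block concrete, here is a minimal numpy sketch of how a single LoRA module modifies a frozen base weight matrix. The dimensions, scaling factor, and initialization below are hypothetical illustrations, not the actual shapes or values used in this model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes for illustration only (not this model's real shapes):
d, r = 64, 8            # hidden size d, LoRA rank r (r << d)
alpha = 16.0            # hypothetical LoRA scaling factor

W = rng.standard_normal((d, d))          # frozen base weight
A = rng.standard_normal((r, d)) * 0.01   # LoRA down-projection
B = rng.standard_normal((d, r)) * 0.01   # LoRA up-projection (trained values)

# Effective weight: base plus a low-rank update scaled by alpha / r.
# Only the A and B factors (2 * r * d parameters) are task-specific,
# instead of the full d * d matrix.
W_eff = W + (alpha / r) * (B @ A)

print(W_eff.shape)                       # same shape as the base weight
print(np.linalg.matrix_rank(B @ A))     # update has rank at most r
```

Because each of the six LoRA modules is just such a low-rank pair, mixing them amounts to a weighted sum of their updates on top of the shared Mistral base weights.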
Key Capabilities & Features
- Mixture-of-multi-LoRAs (moloras): Integrates six pre-trained Mistral-based LoRA modules, including those from Intel/neural-chat-7b-v3-1, migtissera/SynthIA-7B-v1.3, and jondurbin/airoboros-m-7b-3.1.2, among others.
- Efficient LoRA Routing: Employs a gradient-free method to determine optimal LoRA module coefficients, enabling rapid adaptation to unseen tasks.
- Performance Benchmarks: Achieves competitive scores on the Open LLM Leaderboard, with an average of 60.93. Notable scores include 59.98 on ARC, 83.29 on HellaSwag, and 64.12 on MMLU.
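The gradient-free routing idea above can be sketched in a toy setting: given several pre-trained LoRA deltas, search for mixing coefficients that minimize a task loss on a handful of examples, using only forward passes (no backpropagation). Everything here — the sizes, the random-search strategy, and the synthetic task — is a hypothetical illustration of the general technique, not the model's actual router:

```python
import numpy as np

rng = np.random.default_rng(0)
d, r, n_loras = 32, 4, 6   # hypothetical sizes; the model mixes 6 LoRA modules

# Six hypothetical pre-trained low-rank deltas (B_i @ A_i), one per module
deltas = [rng.standard_normal((d, r)) @ rng.standard_normal((r, d)) * 0.05
          for _ in range(n_loras)]

W = rng.standard_normal((d, d))  # frozen base weight

# A tiny synthetic "task": a few input/target pairs standing in for
# validation examples; the target is built so module 2 happens to help.
X = rng.standard_normal((8, d))
Y = X @ (W + deltas[2])

def task_loss(coeffs):
    """Loss of the base weight plus the coefficient-weighted LoRA mixture."""
    W_mix = W + sum(c * dlt for c, dlt in zip(coeffs, deltas))
    return float(np.mean((X @ W_mix - Y) ** 2))

# Gradient-free search: sample random coefficient vectors (forward passes
# only) and keep the best one found.
best_c = np.zeros(n_loras)
best_loss = task_loss(best_c)
for _ in range(200):
    c = rng.dirichlet(np.ones(n_loras))  # candidate mixture weights
    loss = task_loss(c)
    if loss < best_loss:
        best_c, best_loss = c, loss

print(best_loss, task_loss(np.zeros(n_loras)))
```

The key property this sketch shares with the moloras approach is that the coefficients are found without computing gradients, so adapting to an unseen task costs only a few evaluation passes rather than a fine-tuning run.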
Good For
- Developers seeking a Mistral-based model that efficiently combines specialized LoRA modules for varied applications.
- Use cases requiring a model with strong general language understanding and reasoning capabilities, as indicated by its benchmark performance.
- Experimentation with multi-LoRA architectures and gradient-free routing for model adaptation.