openaccess-ai-collective/mistral-7b-slimorcaboros

Text generation · Concurrency cost: 1 · Model size: 7B · Quant: FP8 · Context length: 4k · Published: Oct 13, 2023 · License: apache-2.0 · Architecture: Transformer

SlimOrcaBoros is a Mistral 7B-based language model developed by OpenAccess-AI-Collective. It is fine-tuned on a combination of the SlimOrca, Airoboros 3.1, and RiddleSense datasets, a mix intended to strengthen its reasoning and instruction-following capabilities. Its primary use case is general-purpose language tasks where a 7B-parameter model offers a good balance of performance and efficiency.


SlimOrcaBoros Overview

SlimOrcaBoros is a Mistral 7B model developed by OpenAccess-AI-Collective, distinguished by its fine-tuning mix: it integrates three distinct datasets, SlimOrca, Airoboros 3.1, and RiddleSense. This combination aims to enhance the model's ability to follow instructions, perform complex reasoning, and understand nuanced language.

Key Capabilities

  • Enhanced Instruction Following: Benefits from the instruction-tuned nature of SlimOrca.
  • Improved Reasoning: Draws on the reasoning-oriented Airoboros 3.1 and RiddleSense datasets.
  • General-Purpose Language Tasks: Suitable for a wide range of applications requiring a 7B parameter model.
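Instruction-tuned models like this one are sensitive to prompt formatting. As a minimal sketch, the helper below builds a ChatML-style prompt, a convention common to OpenAccess-AI-Collective releases; the exact template is an assumption here, so confirm it against the model's tokenizer (e.g. `tokenizer.apply_chat_template`) before relying on it.

```python
def format_chatml(messages):
    """Build a ChatML-style prompt from a list of {"role", "content"} dicts.

    Note: the ChatML format (<|im_start|>/<|im_end|> markers) is assumed here,
    not confirmed by the model card; verify with the model's chat template.
    """
    parts = []
    for msg in messages:
        parts.append(f"<|im_start|>{msg['role']}\n{msg['content']}<|im_end|>")
    # Leave an open assistant turn for the model to complete.
    parts.append("<|im_start|>assistant\n")
    return "\n".join(parts)


prompt = format_chatml([
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "What has keys but no locks?"},
])
```

In practice, prefer the tokenizer's built-in chat template when loading the model with `transformers`, since it encodes whatever format the model was actually trained with; the manual helper above is only useful when the template is unavailable.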

Good for

  • Developers seeking a Mistral 7B variant with specialized fine-tuning for instruction adherence.
  • Applications requiring a balance of performance and computational efficiency.
  • Experimentation with models trained on diverse, high-quality instruction and reasoning datasets.