microsoft/Orca-2-7b
Task: Text Generation
Concurrency Cost: 1
Model Size: 7B
Quantization: FP8
Context Length: 4k
Published: Nov 14, 2023
License: microsoft-research-license
Architecture: Transformer

microsoft/Orca-2-7b is a 7 billion parameter language model developed by Microsoft, fine-tuned from the LLaMA 2 base model. It is designed for research purposes, aiming to enhance reasoning capabilities in Small Language Models (SLMs) through training on synthetic data. The model performs well on tasks that require reasoning over user-provided data, reading comprehension, math problem solving, and text summarization. It is intended to serve as a foundation for further research into SLM development and evaluation.
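For orientation, Orca-2 checkpoints expect prompts in the ChatML format described on the model card. The sketch below builds such a prompt; `build_orca2_prompt` is a hypothetical helper name, and the exact system message is an illustrative assumption.

```python
def build_orca2_prompt(system_message: str, user_message: str) -> str:
    """Assemble a ChatML-style prompt of the kind Orca-2 expects.

    Note: helper name and system message are illustrative, not part
    of any official API.
    """
    return (
        f"<|im_start|>system\n{system_message}<|im_end|>\n"
        f"<|im_start|>user\n{user_message}<|im_end|>\n"
        f"<|im_start|>assistant"
    )

# Example usage: a summarization-style request.
prompt = build_orca2_prompt(
    "You are Orca, an AI language model created by Microsoft.",
    "Summarize the following paragraph in one sentence: ...",
)
print(prompt)
```

The resulting string can be tokenized and passed to the model through any serving stack that accepts raw prompts; chat-aware APIs typically apply this template automatically.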
