microsoft/Orca-2-13b
Task: Text Generation
Concurrency Cost: 1
Model Size: 13B
Quant: FP8
Ctx Length: 4k
Published: Nov 14, 2023
License: microsoft-research-license
Architecture: Transformer
Warm: 0.7K
microsoft/Orca-2-13b is a 13 billion parameter language model, fine-tuned from LLAMA-2, specifically designed for research into enhancing small language models' reasoning capabilities. It excels in tasks such as reasoning over user-given data, reading comprehension, math problem-solving, and text summarization, primarily through advanced prompting and synthetic data training. This model is intended to demonstrate how complex workflows can teach SLMs new capabilities, particularly in reasoning.
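Since the card highlights reasoning elicited through prompting, here is a minimal sketch of building a ChatML-style prompt of the kind the Orca-2 model card describes. The system message and question below are illustrative placeholders, not taken from the card.

```python
# Sketch: ChatML-style prompt construction for Orca-2.
# The system/user strings are illustrative assumptions.
def orca2_prompt(system: str, user: str) -> str:
    """Wrap system and user messages in ChatML delimiters,
    leaving the assistant turn open for generation."""
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        f"<|im_start|>assistant\n"
    )

prompt = orca2_prompt(
    "You are a careful assistant that reasons step by step.",
    "A train travels 60 km in 45 minutes. What is its average speed in km/h?",
)
print(prompt)
```

The trailing open `assistant` turn is what cues the model to produce its answer next.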
Popular Sampler Settings
Top 3 parameter combinations used by Featherless users for this model.
temperature: –
top_p: –
top_k: –
frequency_penalty: –
presence_penalty: –
repetition_penalty: –
min_p: –
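As a sketch of how sampler settings like those above might be used, the following builds a request payload for an OpenAI-compatible completions endpoint. The default values and the `build_payload` helper are illustrative assumptions, not the card's actual popular configurations (which did not load here).

```python
# Hypothetical sketch: assembling sampler settings into a request
# payload for an OpenAI-compatible completions endpoint.
# All default values below are assumptions, not Featherless data.
import json

def build_payload(prompt: str, **sampler) -> dict:
    """Merge user-supplied sampler settings over conservative defaults."""
    defaults = {
        "temperature": 0.7,        # assumed default
        "top_p": 0.9,              # assumed default
        "repetition_penalty": 1.1, # assumed default
        "min_p": 0.05,             # assumed default
    }
    defaults.update(sampler)       # explicit settings win
    return {
        "model": "microsoft/Orca-2-13b",
        "prompt": prompt,
        **defaults,
    }

payload = build_payload("Explain step by step: what is 17 * 24?", top_k=40)
print(json.dumps(payload, indent=2))
```

A caller would POST this payload (as JSON) to the provider's completions route; any of the seven parameters listed above can be overridden per request via keyword arguments.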