BeaverAI/mistral-doryV2-12b
Text generation · Concurrency cost: 1 · Model size: 12B · Quant: FP8 · Context length: 32k · Published: Jul 22, 2024 · License: apache-2.0 · Architecture: Transformer · Open weights

BeaverAI/mistral-doryV2-12b is a 12-billion-parameter instruction-tuned causal language model, finetuned from the Mistral Nemo 12B base model. It has a 32768-token context length and is optimized for general instruction following, distinguishing itself from models focused on specific conversational styles. The model was trained using QDoRA on a diverse dataset spanning instruction, reward-rated, and story-based conversations, making it suitable for a broad range of text generation tasks.


Popular Sampler Settings

The top 3 parameter combinations used by Featherless users for this model cover the following sampler parameters:

temperature, top_p, top_k, frequency_penalty, presence_penalty, repetition_penalty, min_p
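As a minimal sketch of how these sampler parameters are passed in practice, the snippet below builds a request payload for an OpenAI-compatible chat completions endpoint. The specific values are illustrative assumptions, not the actual user configurations from this page (those are not reproduced here).

```python
# Hypothetical request payload for an OpenAI-compatible chat completions
# endpoint serving this model. All sampler values below are illustrative
# placeholders, not the popular configs shown on the page.
payload = {
    "model": "BeaverAI/mistral-doryV2-12b",
    "messages": [
        {"role": "user", "content": "Write the opening line of a short story."}
    ],
    "temperature": 0.8,          # assumed value; higher = more varied output
    "top_p": 0.95,               # nucleus sampling cutoff
    "top_k": 40,                 # restrict sampling to the 40 most likely tokens
    "frequency_penalty": 0.0,    # penalize tokens by how often they appeared
    "presence_penalty": 0.0,     # penalize tokens that appeared at all
    "repetition_penalty": 1.05,  # multiplicative repetition penalty
    "min_p": 0.05,               # drop tokens below 5% of the top token's prob
}

# This dict would be POSTed as JSON to the provider's
# /v1/chat/completions endpoint with an API key.
print(sorted(payload.keys()))
```

Note that `frequency_penalty`/`presence_penalty` (additive, OpenAI-style) and `repetition_penalty`/`min_p` (common in open-source inference servers) come from different sampling traditions; which subset is honored depends on the serving backend.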