VAGOsolutions/SauerkrautLM-Nemo-12b-Instruct
Text generation · Concurrency cost: 1 · Model size: 12B · Quant: FP8 · Context length: 32k · Published: Jul 22, 2024 · License: apache-2.0 · Architecture: Transformer · Open weights

VAGOsolutions/SauerkrautLM-Nemo-12b-Instruct is a 12-billion-parameter instruction-tuned model from VAGO solutions, fine-tuned from mistralai/Mistral-Nemo-Instruct-2407. It uses Spectrum Fine-Tuning, which updates only 25% of the model's layers, on the bilingual German-English Sauerkraut Mix v2 dataset. This resource-efficient approach strengthens the model's German and English capabilities while also improving performance in other languages.
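A hosted model like this is typically queried through an OpenAI-compatible chat-completions API. The sketch below shows what such a request payload might look like; the helper function name, the endpoint assumption, and the German example prompt are illustrative, not part of any official client.

```python
# Minimal sketch of a chat-completions payload for an OpenAI-compatible
# endpoint (such as the one Featherless exposes). The helper name and
# the example message are assumptions for illustration only.
import json


def build_chat_request(user_message: str,
                       model: str = "VAGOsolutions/SauerkrautLM-Nemo-12b-Instruct") -> dict:
    """Assemble a chat-completions request body for an OpenAI-compatible API."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
        "max_tokens": 256,
    }


# German works as naturally as English for this model:
payload = build_chat_request("Fasse den folgenden Text in zwei Sätzen zusammen.")
print(json.dumps(payload, indent=2))
```

The resulting dict would be sent as the JSON body of a POST to the provider's `/v1/chat/completions` route (exact URL depends on the host).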


Popular Sampler Settings

The three parameter combinations most used by Featherless users for this model draw on the following sampler settings:

- temperature
- top_p
- top_k
- frequency_penalty
- presence_penalty
- repetition_penalty
- min_p
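The concrete values in those user configurations are not reproduced here, but a request combining these knobs might look like the sketch below. Every number is an illustrative assumption, not a recorded user setting; note that `repetition_penalty` and `min_p` are extensions offered by some OpenAI-compatible servers rather than part of the official OpenAI API.

```python
# Illustrative sampler configuration for an OpenAI-compatible text
# generation request. All values are assumptions for demonstration.
def sampler_settings() -> dict:
    return {
        "temperature": 0.7,         # randomness of token sampling
        "top_p": 0.9,               # nucleus sampling cutoff
        "top_k": 40,                # keep only the k most likely tokens
        "frequency_penalty": 0.0,   # penalize tokens by how often they appear
        "presence_penalty": 0.0,    # penalize tokens that appear at all
        "repetition_penalty": 1.1,  # multiplicative penalty on repeats
        "min_p": 0.05,              # drop tokens below this fraction of the top probability
    }
```

These keys would be merged into the request body alongside `model` and `messages`; servers that do not recognize an extension key generally ignore or reject it, so consult the provider's API docs before relying on `min_p` or `repetition_penalty`.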