tabularisai/Faust-1
TEXT GENERATION

- Concurrency Cost: 1
- Model Size: 2B
- Quant: BF16
- Ctx Length: 32k
- Published: Jan 22, 2026
- Architecture: Transformer

Faust-1 by tabularisai is a 1.6 billion parameter German-first large language model, trained from scratch on a predominantly German corpus. It features a custom tokenizer optimized for German morphology and compounding, resulting in efficient tokenization for German text. Designed for local and cost-efficient deployment, Faust-1 excels in German conversational tasks and is suitable for consumer-grade hardware and privacy-sensitive setups.


Popular Sampler Settings

The top 3 parameter combinations used by Featherless users for this model are built from the following sampler parameters:

- temperature
- top_p
- top_k
- frequency_penalty
- presence_penalty
- repetition_penalty
- min_p
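The sampler parameters above map directly onto the request body of an OpenAI-compatible chat-completions call, which is how hosted models like this are typically queried. The sketch below builds such a request payload; the endpoint URL, the specific parameter values, and the German prompt are illustrative assumptions, not the actual top configurations from this page.

```python
import json

# Hypothetical sketch: sampler settings as fields of an OpenAI-compatible
# chat-completions request. Values are placeholders, not real user configs.
payload = {
    "model": "tabularisai/Faust-1",
    "messages": [
        {"role": "user", "content": "Erkläre kurz, was ein Tokenizer ist."}
    ],
    "temperature": 0.7,        # randomness of sampling
    "top_p": 0.9,              # nucleus sampling cutoff
    "top_k": 40,               # keep only the 40 most likely tokens
    "frequency_penalty": 0.0,  # penalize tokens by how often they appeared
    "presence_penalty": 0.0,   # penalize tokens that appeared at all
    "repetition_penalty": 1.1, # multiplicative damping of repeated tokens
    "min_p": 0.05,             # drop tokens below 5% of the top probability
    "max_tokens": 256,
}

# This JSON body would be POSTed to the provider's /v1/chat/completions
# endpoint (exact URL and auth depend on the deployment).
print(json.dumps(payload, indent=2))
```

Not every OpenAI-compatible server accepts all of these fields (`top_k`, `repetition_penalty`, and `min_p` are extensions some backends support); unsupported keys are usually ignored or rejected, so check the provider's API reference before relying on them.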