BramVanroy/GEITje-7B-ultra
Text generation · Concurrency cost: 1 · Model size: 7B · Quantization: FP8 · Context length: 8k · Published: Jan 27, 2024 · License: cc-by-nc-4.0 · Architecture: Transformer · Open weights

BramVanroy/GEITje-7B-ultra is a 7-billion-parameter conversational model for Dutch, based on Mistral 7B and aligned through AI feedback using Direct Preference Optimization (DPO) on a synthetic preference dataset of approximately 56 million tokens generated with GPT-4-Turbo. The model is designed specifically for Dutch instruction following and chat, and it outperforms previous GEITje models on Dutch-specific benchmarks.
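Since the model is served for text generation, a typical way to query it is through an OpenAI-style chat-completion request. The sketch below builds such a payload; the endpoint URL and the exact client shape are assumptions for illustration and are not documented on this page.

```python
import json

# Hypothetical OpenAI-compatible chat endpoint; the URL and any auth
# headers are assumptions, not taken from this model card.
API_URL = "https://api.featherless.ai/v1/chat/completions"


def build_chat_request(user_message: str, max_tokens: int = 256) -> dict:
    """Build an OpenAI-style chat-completion payload targeting GEITje-7B-ultra."""
    return {
        "model": "BramVanroy/GEITje-7B-ultra",
        "messages": [
            # The model is aligned for Dutch chat, so prompts are in Dutch.
            {"role": "user", "content": user_message},
        ],
        "max_tokens": max_tokens,
    }


payload = build_chat_request("Wat is de hoofdstad van Nederland?")
print(json.dumps(payload, ensure_ascii=False))
```

The payload can then be POSTed to the endpoint with any HTTP client; the model name string is the only field taken directly from this page.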


Popular Sampler Settings

The three most popular parameter combinations used by Featherless users for this model cover the following sampler settings:

temperature
top_p
top_k
frequency_penalty
presence_penalty
repetition_penalty
min_p
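The parameters above can be passed alongside a generation request as a sampler configuration. The sketch below shows one such configuration; the numeric values are illustrative placeholders only, since the actual top user configurations are not reproduced here.

```python
# Illustrative sampler configuration covering the parameters listed above.
# The values are placeholder defaults for demonstration, not the actual
# "top 3" Featherless user configurations.
sampler_config = {
    "temperature": 0.7,
    "top_p": 0.9,
    "top_k": 40,
    "frequency_penalty": 0.0,
    "presence_penalty": 0.0,
    "repetition_penalty": 1.1,
    "min_p": 0.05,
}


def apply_sampler_settings(request: dict, config: dict) -> dict:
    """Return a copy of a request payload with sampler settings merged in."""
    merged = dict(request)
    merged.update(config)
    return merged


request = apply_sampler_settings(
    {"model": "BramVanroy/GEITje-7B-ultra", "prompt": "Hallo!"},
    sampler_config,
)
```

Lower temperature and top_p values generally make output more deterministic, while the penalty parameters discourage repeated tokens; which combination works best depends on the use case.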