BramVanroy/GEITje-7B-ultra-sft
Text generation · Model size: 7B · Quantization: FP8 · Context length: 8k · Published: Jan 22, 2024 · License: cc-by-nc-4.0 · Architecture: Transformer · Concurrency cost: 1 · Open weights

BramVanroy/GEITje-7B-ultra-sft is a 7-billion-parameter instruction-tuned causal language model developed by Bram Vanroy, built on Rijgersberg/GEITje-7B, which is itself based on Mistral 7B. It was fine-tuned on 240M tokens of synthetic Dutch data, including data generated with GPT-3.5-turbo and GPT-4-turbo, and handles multi-turn conversation and code generation with an 8192-token context length. The model is designed specifically for conversational AI in Dutch, leveraging diverse synthetic data for robust interaction.
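Since the model is instruction-tuned for multi-turn chat, prompts should follow its chat template. The authoritative template ships with the model's tokenizer (in practice, call `tokenizer.apply_chat_template` from the `transformers` library); the sketch below assumes a hypothetical zephyr-style template purely to illustrate how a message list is flattened into a single prompt string.

```python
def build_prompt(messages):
    """Flatten a list of {'role', 'content'} messages into one prompt string.

    NOTE: this assumes a zephyr-style template; the real template is defined
    by the model's tokenizer and may differ. Prefer
    tokenizer.apply_chat_template(messages, add_generation_prompt=True).
    """
    parts = []
    for message in messages:
        # Each turn is tagged with its role and closed with the EOS token.
        parts.append(f"<|{message['role']}|>\n{message['content']}</s>\n")
    # End with an open assistant turn so the model continues from there.
    parts.append("<|assistant|>\n")
    return "".join(parts)


# Example: a single Dutch user turn.
prompt = build_prompt([{"role": "user", "content": "Hallo, hoe gaat het?"}])
```

Keeping the entire conversation history in `messages` is what lets the model use its 8192-token context window for multi-turn dialogue.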


Popular Sampler Settings

The three most popular parameter combinations used by Featherless users for this model draw on the following sampler settings:

- temperature
- top_p
- top_k
- frequency_penalty
- presence_penalty
- repetition_penalty
- min_p
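To make the filtering parameters above concrete, here is a minimal pure-Python sketch of how temperature, top_k, top_p (nucleus), and min_p restrict the candidate token set before sampling. This is an illustrative reimplementation, not Featherless's actual sampler code; the function name and toy logits are invented for the example.

```python
import math


def softmax(logits):
    """Numerically stable softmax over a list of logits."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]


def surviving_tokens(logits, temperature=1.0, top_k=0, top_p=1.0, min_p=0.0):
    """Return the sorted indices of tokens that pass all sampler filters.

    - temperature rescales logits before the softmax (higher = flatter).
    - top_k keeps only the k most probable tokens (0 disables it).
    - top_p keeps the smallest set whose cumulative probability >= top_p.
    - min_p drops tokens below min_p * (probability of the top token).
    """
    probs = softmax([x / temperature for x in logits])
    order = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)
    keep = set(order)
    if top_k > 0:
        keep &= set(order[:top_k])
    if top_p < 1.0:
        cumulative, nucleus = 0.0, []
        for i in order:
            nucleus.append(i)
            cumulative += probs[i]
            if cumulative >= top_p:
                break
        keep &= set(nucleus)
    if min_p > 0.0:
        threshold = min_p * probs[order[0]]
        keep &= {i for i, p in enumerate(probs) if p >= threshold}
    return sorted(keep)


# Toy vocabulary of 4 tokens: top_k=2 keeps only the two most likely.
candidates = surviving_tokens([2.0, 1.0, 0.5, -1.0], top_k=2)  # → [0, 1]
```

The frequency, presence, and repetition penalties are applied earlier, by adjusting logits of tokens already seen in the output, rather than by filtering the candidate set.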