bardsai/jaskier-7b-dpo-v6.1
Type: Text generation
Concurrency cost: 1
Model size: 7B
Quantization: FP8
Context length: 8k
Published: Feb 20, 2024
License: CC-BY-4.0
Architecture: Transformer
Weights: Open
Jaskier-7b-dpo-v6.1 is a 7-billion-parameter causal language model developed by bards.ai, based on a downstream version of Mistral 7B. It is fine-tuned with Direct Preference Optimization (DPO) on a mathematical preference dataset, making it suitable for tasks requiring improved reasoning and factual accuracy. The model has a context length of 8192 tokens and is designed for conversational applications.
Popular Sampler Settings
Featherless surfaces the top three sampler configurations its users apply to this model. The specific values are shown interactively on the site; the parameters covered are: temperature, top_p, top_k, frequency_penalty, presence_penalty, repetition_penalty, and min_p.
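As a sketch of how these sampler parameters are typically used, the snippet below assembles them into a request body for an OpenAI-style completions endpoint (Featherless exposes an OpenAI-compatible API). The values and the `build_completion_payload` helper are illustrative assumptions, not the actual top user configurations from the site.

```python
def build_completion_payload(prompt: str,
                             model: str = "bardsai/jaskier-7b-dpo-v6.1") -> dict:
    """Assemble a request body for an OpenAI-style completions endpoint.

    The sampler values below are placeholders chosen for illustration;
    substitute whichever configuration the Featherless widget reports.
    """
    return {
        "model": model,
        "prompt": prompt,
        "max_tokens": 256,
        # Sampler parameters listed on the model card:
        "temperature": 0.7,
        "top_p": 0.9,
        "top_k": 40,
        "frequency_penalty": 0.0,
        "presence_penalty": 0.0,
        "repetition_penalty": 1.1,
        "min_p": 0.05,
    }

payload = build_completion_payload("Explain DPO fine-tuning in one sentence.")
```

The resulting dictionary can be sent as JSON to the completions endpoint with any HTTP client; only the keys matter here, since each maps one-to-one onto a sampler parameter from the list above.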