dphn/dolphin-2.9-llama3-70b
TEXT GENERATION
Concurrency Cost: 4 · Model Size: 70B · Quant: FP8 · Ctx Length: 8k · Published: Apr 24, 2024 · License: llama3 · Architecture: Transformer
Dolphin 2.9 Llama 3 70b is a 70-billion-parameter language model fine-tuned by Eric Hartford, Lucas Atkins, and Fernando Fernandes from Meta's Llama-3-70b. It has an 8192-token context length and is designed for instruction following, conversational tasks, and coding, with initial agentic abilities and function-calling support. The model is uncensored and highly compliant, so users should implement their own alignment layer before deploying it for ethical use. It is suited to a broad range of general-purpose AI applications.
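Since the card notes the model ships without built-in guardrails, the simplest alignment layer is a system prompt prepended to every conversation. A minimal sketch, assuming the common OpenAI-style chat-messages format; the system-prompt wording and the helper name `with_guardrails` are illustrative, not part of the model card:

```python
# Sketch: a minimal "alignment layer" implemented as a system prompt,
# per the card's note that users must supply their own alignment.
# The prompt text below is an illustrative assumption.
def with_guardrails(user_message: str) -> list[dict]:
    system = (
        "You are a helpful assistant. Refuse requests that are illegal "
        "or harmful, and briefly explain any refusal."
    )
    return [
        {"role": "system", "content": system},
        {"role": "user", "content": user_message},
    ]

messages = with_guardrails("Summarize the Llama 3 license in one sentence.")
print(messages[0]["role"])  # the system prompt always comes first
```

A production deployment would typically pair a prompt like this with output-side filtering, since system prompts alone are not a hard guarantee.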
Popular Sampler Settings
Top 3 parameter combinations used by Featherless users for this model.
temperature: –
top_p: –
top_k: –
frequency_penalty: –
presence_penalty: –
repetition_penalty: –
min_p: –
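The sampler parameters listed above map directly onto the fields of an OpenAI-style chat-completion request, which is how most hosted endpoints for this model are queried. A minimal sketch of building such a request payload; every sampler value below is a placeholder assumption (the page above does not publish the actual popular settings), and `top_k`, `repetition_penalty`, and `min_p` are extension fields supported only by some OpenAI-compatible backends:

```python
# Sketch: assembling a chat-completion payload using the sampler
# parameters from the table above. All numeric values are assumed
# placeholders, not this model's recommended settings.
import json

def build_payload(prompt: str) -> dict:
    return {
        "model": "dphn/dolphin-2.9-llama3-70b",
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,          # assumed value; tune per use case
        "top_p": 0.9,
        "top_k": 40,                 # extension field on some backends
        "frequency_penalty": 0.0,
        "presence_penalty": 0.0,
        "repetition_penalty": 1.05,  # extension field on some backends
        "min_p": 0.05,               # extension field on some backends
        "max_tokens": 512,           # stay well inside the 8k context
    }

payload = build_payload("Write a haiku about dolphins.")
print(json.dumps(payload, indent=2))
```

The payload would then be POSTed to the provider's chat-completions endpoint with an API key; endpoint URL and authentication details depend on the host.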