Dolphin 2.9.4 Llama 3.1 8b is an 8-billion-parameter language model fine-tuned by Eric Hartford and Cognitive Computations from Meta's Llama 3.1 8b base. It has a 32,768-token context length and is trained for instruction following, conversation, coding, and agentic tasks, including function calling. The model is uncensored and highly compliant: it is designed to follow its system prompt closely and obey instructions across multiple languages.
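Because the model's behavior is steered heavily by the system prompt, prompt formatting matters. As a minimal sketch, assuming the ChatML-style chat template that Dolphin releases commonly use (the `build_chatml_prompt` helper below is illustrative, not part of any official API):

```python
def build_chatml_prompt(system: str, user: str) -> str:
    """Assemble a ChatML-style prompt string for a single-turn exchange.

    The trailing '<|im_start|>assistant\\n' leaves the prompt open for the
    model to generate the assistant's reply.
    """
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        f"<|im_start|>assistant\n"
    )

# Hypothetical system prompt; the model is designed to obey it closely.
prompt = build_chatml_prompt(
    "You are Dolphin, a helpful coding assistant.",
    "Write a Python function that reverses a string.",
)
print(prompt)
```

In practice you would pass a string like this to your inference runtime, or let the tokenizer's built-in chat template produce the equivalent formatting for you.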