Michel-13B by PotatoOff is a 13 billion parameter uncensored language model fine-tuned from NousHermes-Llama2-13B, designed for general tasks. It utilizes the Llama2 prompt template and has a context length of 4096 tokens. This model is optimized for broad applications where an uncensored response is desired, offering a distinct alternative to more restrictive models.
Overview
Michel-13B is a 13 billion parameter language model developed by PotatoOff, fine-tuned from the NousHermes-Llama2-13B architecture. Its primary distinction is being an uncensored model, making it suitable for general tasks where content restrictions might be undesirable. The model operates using the Llama2 prompt template and supports a context length of 4096 tokens.
Key Characteristics
- Uncensored Nature: Designed to provide responses without content filtering, differentiating it from many other instruction-tuned models.
- Llama2 Prompt Template: Adheres to the Llama2 prompting format, ensuring compatibility and ease of use for developers familiar with this structure.
- General Task Focus: Intended for a wide array of applications rather than specialized domains.
Performance & Usage
Michel-13B's benchmark results can be explored on the OpenLLM Leaderboard. For best results, the developer recommends specific inference parameters: a temperature of 0.8, top_p of 0.75, and a repetition penalty of 1.05. Quantized versions (Exl2 and GGUF) are also available for more memory-efficient deployment.
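To make the recommended settings concrete, the sketch below shows how an inference engine typically combines them during sampling: the repetition penalty pushes down already-generated tokens, the temperature sharpens the distribution, and top-p keeps only the smallest nucleus of tokens covering 75% of the probability mass. This is an illustrative pure-Python reconstruction of standard sampling logic, not PotatoOff's actual implementation; the example logits are made up.

```python
import math

# Developer-suggested sampling settings for Michel-13B (from the model card).
TEMPERATURE = 0.8
TOP_P = 0.75
REPETITION_PENALTY = 1.05

def filter_logits(logits: dict[int, float], previous_ids: set[int]) -> dict[int, float]:
    """Apply repetition penalty, temperature, and top-p (nucleus) filtering.

    `logits` maps token id -> raw score; `previous_ids` holds tokens already
    generated. Returns a token id -> probability map over the kept nucleus.
    """
    adjusted = {}
    for tok, score in logits.items():
        if tok in previous_ids:
            # Repetition penalty: shrink positive scores, amplify negative ones.
            score = score / REPETITION_PENALTY if score > 0 else score * REPETITION_PENALTY
        # Temperature < 1 sharpens the distribution.
        adjusted[tok] = score / TEMPERATURE
    # Numerically stable softmax.
    m = max(adjusted.values())
    exps = {t: math.exp(s - m) for t, s in adjusted.items()}
    z = sum(exps.values())
    probs = {t: e / z for t, e in exps.items()}
    # Top-p: keep the smallest set of tokens whose cumulative mass >= TOP_P.
    kept, total = {}, 0.0
    for tok, p in sorted(probs.items(), key=lambda kv: -kv[1]):
        kept[tok] = p
        total += p
        if total >= TOP_P:
            break
    # Renormalise over the surviving nucleus.
    return {t: p / total for t, p in kept.items()}
```

In practice these three values are passed directly as `temperature`, `top_p`, and `repetition_penalty` to whatever backend runs the model (e.g. a `transformers` generation call or a GGUF runner such as llama.cpp).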