migtissera/Tess-7B-v2.0
TEXT GENERATION · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 8k · Published: Mar 25, 2024 · License: apache-2.0 · Architecture: Transformer

Tess-7B-v2.0 by migtissera is a 7 billion parameter general-purpose large language model built upon the Mistral-7B-v0.2 base architecture. Designed for broad applications, it features an 8192-token context length and uses a specific SYSTEM/USER/ASSISTANT prompt format. This model is intended for general conversational tasks, though it is noted as deprecated due to a training parameter issue.


Tess-7B-v2.0 Overview

Tess-7B-v2.0, developed by migtissera, is a 7 billion parameter general-purpose large language model. It is built on the Mistral-7B-v0.2 base architecture and supports an 8192-token context length. The model is designed for a wide range of conversational applications and uses the prompt format `SYSTEM: <ANY SYSTEM CONTEXT>\nUSER: \nASSISTANT:`.

Key Characteristics

  • Base Model: Fine-tuned from Mistral-7B-v0.2.
  • Parameter Count: 7 billion parameters.
  • Context Length: Supports an 8192-token context window.
  • Prompt Format: Adheres to a specific SYSTEM/USER/ASSISTANT structure for optimal interaction.
  • Uncensored: The model's outputs are not filtered, so it may generate inappropriate, biased, or offensive content.
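The SYSTEM/USER/ASSISTANT format above can be sketched as a small prompt builder. This is a minimal illustration; the function name and the example strings are our own, not part of the model card:

```python
def build_tess_prompt(user_message: str, system_context: str = "") -> str:
    """Assemble a prompt in Tess's SYSTEM/USER/ASSISTANT format.

    The model expects the literal role labels, each on its own line,
    with the ASSISTANT label left open for the model to complete.
    """
    prompt = ""
    if system_context:
        prompt += f"SYSTEM: {system_context}\n"
    prompt += f"USER: {user_message}\nASSISTANT:"
    return prompt


# Example: a prompt with a system context and a user question.
prompt = build_tess_prompt(
    "What is the capital of France?",
    system_context="You are a helpful assistant.",
)
print(prompt)
```

The trailing `ASSISTANT:` label is left unterminated on purpose: the model's generation continues directly from it.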

Limitations & Biases

Users should be aware that Tess-7B-v2.0 can occasionally produce inaccurate or misleading information. Despite efforts to refine the training data, the model may still generate inappropriate, biased, or offensive content; because it is uncensored, caution and cross-checking of its outputs are advised. Note: this model is currently deprecated due to a training parameter issue, and an updated version is anticipated.

Popular Sampler Settings

The most popular parameter combinations used by Featherless users for this model adjust the following sampler settings:

  • temperature
  • top_p
  • top_k
  • frequency_penalty
  • presence_penalty
  • repetition_penalty
  • min_p
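As a sketch of how these sampler parameters travel in practice, the snippet below builds a request body for an OpenAI-compatible completions endpoint. The helper name and every value shown are illustrative assumptions, not the actual configurations used on Featherless:

```python
import json

# Sampler parameters listed in this model card. Servers differ in which
# of these they accept (top_k, repetition_penalty, and min_p are common
# extensions beyond the base OpenAI parameter set).
ALLOWED_SAMPLERS = {
    "temperature", "top_p", "top_k", "frequency_penalty",
    "presence_penalty", "repetition_penalty", "min_p",
}


def make_completion_payload(prompt: str, **samplers) -> str:
    """Build a JSON body for a completions request carrying sampler settings."""
    unknown = set(samplers) - ALLOWED_SAMPLERS
    if unknown:
        raise ValueError(f"unsupported sampler parameter(s): {sorted(unknown)}")
    payload = {"model": "migtissera/Tess-7B-v2.0", "prompt": prompt, **samplers}
    return json.dumps(payload)


# Illustrative values only -- not a recommended configuration.
body = make_completion_payload(
    "USER: Hello\nASSISTANT:",
    temperature=0.7,
    top_p=0.9,
    repetition_penalty=1.1,
    min_p=0.05,
)
```

The validation step simply guards against typos in parameter names; the resulting JSON string would be sent as the body of a POST to the serving endpoint.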