Mihaiii/Bucharest-0.2
Text Generation · Concurrency Cost: 1 · Model Size: 10.7B · Quant: FP8 · Ctx Length: 4K · License: apache-2.0 · Architecture: Transformer · Open Weights

Mihaiii/Bucharest-0.2 is an instruct-tuned language model developed by Mihaiii, fine-tuned from the migtissera/Tess-10.7B-v1.5b base model. This experimental model is trained on a private dataset combined with a curated subset of OpenHermes-2.5. It uses a SYSTEM/USER/ASSISTANT prompt format, making it suitable for conversational and instruction-following tasks.


Overview

Mihaiii/Bucharest-0.2 is an experimental instruct-tuned language model created by Mihaiii. It is built upon the migtissera/Tess-10.7B-v1.5b base model and has been fine-tuned using a combination of a private dataset and a curated subset of the OpenHermes-2.5 dataset, specifically Mihaiii/OpenHermes-2.5-1k-longest-curated.

Key Characteristics

  • Base Model: migtissera/Tess-10.7B-v1.5b.
  • Training Data: A blend of private data and a subset of OpenHermes-2.5, focusing on longer, curated examples.
  • Prompt Format: Designed for instruction-following with a clear SYSTEM:, USER:, ASSISTANT: structure.
  • Experimental Nature: The creator notes that this model series is primarily an experiment and recommends the Pallas series for more robust applications.
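Since the model expects the SYSTEM/USER/ASSISTANT structure described above, a small helper for assembling a single-turn prompt can be sketched as follows. The exact whitespace and newline conventions are an assumption here (the model card only names the three section labels), so adjust to match the upstream Tess format if results look off:

```python
def build_prompt(system: str, user: str) -> str:
    """Assemble a single-turn prompt in the SYSTEM/USER/ASSISTANT format.

    The ASSISTANT: label is left open so the model completes it.
    Layout (one section per line) is an assumption, not confirmed
    by the model card.
    """
    return f"SYSTEM: {system}\nUSER: {user}\nASSISTANT:"


prompt = build_prompt(
    "You are a helpful assistant.",
    "Summarize the plot of Hamlet in one sentence.",
)
print(prompt)
```

The trailing `ASSISTANT:` with no content is what cues an instruct-tuned model to begin its reply; generation should stop when the model emits the next `USER:` or an end-of-sequence token.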

Use Cases

This model is primarily intended for:

  • Instruction-following tasks: Due to its instruct-tuned nature and specific prompt format.
  • Conversational AI: Its training on dialogue-rich datasets makes it suitable for generating responses in a chat-like interaction.
  • Experimental applications: Developers interested in exploring models fine-tuned on specific OpenHermes subsets might find this useful for research or testing purposes.

Popular Sampler Settings

The top 3 parameter combinations used by Featherless users for this model adjust the following sampler parameters:

  • temperature
  • top_p
  • top_k
  • frequency_penalty
  • presence_penalty
  • repetition_penalty
  • min_p
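The parameters above map directly onto the fields of a typical text-completion request body. A minimal sketch of such a payload is shown below; the values are illustrative placeholders, not the community presets themselves (those are only visible in the settings widget on the page), and the field names assume an OpenAI-style completions schema:

```python
# Illustrative request payload for Mihaiii/Bucharest-0.2.
# Values are example defaults, NOT the "top 3" community configs.
payload = {
    "model": "Mihaiii/Bucharest-0.2",
    "prompt": "SYSTEM: You are a helpful assistant.\nUSER: Hello!\nASSISTANT:",
    "temperature": 0.7,        # randomness of token selection
    "top_p": 0.9,              # nucleus sampling cutoff
    "top_k": 40,               # restrict sampling to k most likely tokens
    "frequency_penalty": 0.0,  # penalize tokens by how often they appeared
    "presence_penalty": 0.0,   # penalize tokens that appeared at all
    "repetition_penalty": 1.1, # multiplicative penalty on repeats
    "min_p": 0.05,             # drop tokens below this fraction of the top prob
}
print(sorted(payload))
```

In practice this dictionary would be sent as the JSON body of a POST to the provider's completions endpoint, with the prompt string following the SYSTEM/USER/ASSISTANT format the model expects.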