Mihaiii/Bucharest-0.1
Hugging Face
TEXT GENERATION · Concurrency Cost: 1 · Model Size: 10.7B · Quant: FP8 · Ctx Length: 4k · License: apache-2.0 · Architecture: Transformer · Open Weights · Warm

Mihaiii/Bucharest-0.1 is an experimental instruction-tuned language model based on migtissera/Tess-10.7B-v1.5b. It is fine-tuned on a private dataset and is primarily an experiment rather than a production-ready solution. Users are advised to consider the developer's Pallas series for more robust applications.


Overview

Mihaiii/Bucharest-0.1 is an experimental instruction-tuned language model developed by Mihaiii. It is built upon the migtissera/Tess-10.7B-v1.5b base model and has been fine-tuned using a private dataset. This model is explicitly noted as an experiment, with the developer recommending their "Pallas series" for general use cases.

Key Characteristics

  • Base Model: Fine-tuned from migtissera/Tess-10.7B-v1.5b.
  • Training Data: Utilizes a private dataset for instruction tuning.
  • Purpose: Primarily an experimental model, not intended for production environments.

Prompt Format

The model expects a specific prompt format:

SYSTEM: <ANY SYSTEM CONTEXT>
USER: 
ASSISTANT:
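The format above can be assembled programmatically before sending text to the model. A minimal sketch (the helper name `build_prompt` is illustrative, not part of any official API):

```python
def build_prompt(system: str, user: str) -> str:
    """Assemble a prompt in the SYSTEM/USER/ASSISTANT format this model expects.

    The model generates its reply after the trailing "ASSISTANT:" marker.
    """
    return f"SYSTEM: {system}\nUSER: {user}\nASSISTANT:"

prompt = build_prompt(
    "You are a helpful assistant.",
    "Summarize the plot of Hamlet in one sentence.",
)
```

Keeping the `SYSTEM:` line even when no special context is needed tends to be safer than omitting it, since instruction-tuned models are sensitive to deviations from their training-time template.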

Usage Recommendation

Given its experimental nature, users seeking a more stable or production-ready model are directed to explore the developer's Pallas series instead.

Popular Sampler Settings

The top three parameter combinations used by Featherless users for this model adjust the following sampler settings:

  • temperature
  • top_p
  • top_k
  • frequency_penalty
  • presence_penalty
  • repetition_penalty
  • min_p
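These settings are typically passed as fields of an OpenAI-compatible completion request. A sketch of such a request body follows; the numeric values are illustrative placeholders, not the actual configurations reported by Featherless users:

```python
# Illustrative sampler settings for an OpenAI-compatible completion request.
# All numeric values below are placeholders, NOT the real Featherless configs.
payload = {
    "model": "Mihaiii/Bucharest-0.1",
    "prompt": "SYSTEM: You are a helpful assistant.\nUSER: Hello.\nASSISTANT:",
    "temperature": 0.7,        # softens/sharpens the token distribution
    "top_p": 0.9,              # nucleus sampling: keep smallest set with cum. prob >= 0.9
    "top_k": 40,               # restrict sampling to the 40 most likely tokens
    "frequency_penalty": 0.0,  # penalize tokens proportionally to prior occurrences
    "presence_penalty": 0.0,   # flat penalty on any token that has already appeared
    "repetition_penalty": 1.1, # multiplicative penalty on repeated tokens
    "min_p": 0.05,             # drop tokens below 5% of the top token's probability
}
```

In practice, `top_p`/`top_k`/`min_p` all prune the candidate token set, so users usually tune one of them rather than all three at once.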