blueapple8259/TinyAlpaca-v0.1

Parameters: 1.1B · Tensor type: BF16 · Context length: 2048
License: MIT · Updated: Nov 16, 2023

Model Overview

TinyAlpaca-v0.1 is an experimental language model developed by blueapple8259. It is based on the TinyLlama architecture, specifically fine-tuned from the TinyLlama-1.1B-intermediate-step-715k-1.5T checkpoint. The fine-tuning process utilized the yahma/alpaca-cleaned dataset, a common resource for instruction-tuning smaller models.

Key Characteristics

  • Base Model: TinyLlama-1.1B
  • Fine-tuning Dataset: yahma/alpaca-cleaned
  • Instruction Format: Uses a specific Alpaca-style prompt template for instructions, with separate formats for prompts with and without input.
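The card notes separate prompt formats for instructions with and without an input field but does not reproduce them. The sketch below assumes the standard Stanford Alpaca templates, which the yahma/alpaca-cleaned dataset follows; the exact wording used during this fine-tune is not confirmed by the card.

```python
# Assumed templates: the original Stanford Alpaca format, which the
# yahma/alpaca-cleaned dataset uses. The card itself does not quote them.

PROMPT_WITH_INPUT = (
    "Below is an instruction that describes a task, paired with an input "
    "that provides further context. Write a response that appropriately "
    "completes the request.\n\n"
    "### Instruction:\n{instruction}\n\n"
    "### Input:\n{input}\n\n"
    "### Response:\n"
)

PROMPT_NO_INPUT = (
    "Below is an instruction that describes a task. Write a response that "
    "appropriately completes the request.\n\n"
    "### Instruction:\n{instruction}\n\n"
    "### Response:\n"
)


def build_prompt(instruction: str, input_text: str = "") -> str:
    """Pick the with-input or no-input template, as the card describes."""
    if input_text.strip():
        return PROMPT_WITH_INPUT.format(instruction=instruction, input=input_text)
    return PROMPT_NO_INPUT.format(instruction=instruction)
```

At inference time the model's completion would be read from the text generated after the `### Response:` marker.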

Important Note on Performance

The creator explicitly states that this model performs very poorly and advises against downloading or using it. It appears to be an early experiment or a demonstration of a fine-tuning workflow rather than a model intended for practical use. Users seeking a functional or performant LLM should consider alternatives.