Frusto/llama-3.2-1b-frusto360-final

Hosted on Hugging Face · Text Generation · Open Weights

  • Model Size: 1B parameters
  • Quantization: BF16
  • Context Length: 32k tokens
  • Concurrency Cost: 1
  • Published: Feb 20, 2026
  • License: apache-2.0
  • Architecture: Transformer

Frusto/llama-3.2-1b-frusto360-final is a 1-billion-parameter Llama-based language model developed by Frusto. It was fine-tuned with Unsloth and Hugging Face's TRL library, enabling roughly 2x faster training, and is intended for general language tasks where a compact, efficiently trained model is practical.


Model Overview

Frusto/llama-3.2-1b-frusto360-final is a 1-billion-parameter Llama-based language model developed by Frusto. It was fine-tuned from unsloth/llama-3.2-1b-instruct-unsloth-bnb-4bit using the Unsloth library together with Hugging Face's TRL library. A key differentiator is its optimized training process, which ran roughly 2x faster thanks to Unsloth.
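The model card itself ships no usage code, so the following is only a sketch of how the checkpoint might be loaded, assuming it behaves like any other Llama-based causal LM under Hugging Face's `transformers` library; the sampling parameters are illustrative assumptions, not values from the card.

```python
# Hypothetical usage sketch; not taken from the model card.
# Assumes the checkpoint loads as a standard Llama-based causal LM
# via the `transformers` library.

def generation_config(max_new_tokens: int = 128) -> dict:
    """Illustrative sampling settings (assumed, not specified by the card)."""
    return {
        "max_new_tokens": max_new_tokens,
        "do_sample": True,
        "temperature": 0.7,
        "top_p": 0.9,
    }

def generate(prompt: str) -> str:
    """Downloads the checkpoint on first call; deliberately not run here."""
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "Frusto/llama-3.2-1b-frusto360-final"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    # BF16 matches the quantization listed in the card's metadata.
    model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16)
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, **generation_config())
    return tokenizer.decode(outputs[0], skip_special_tokens=True)

# Example call (commented out: fetches the full model weights):
# print(generate("Summarize: Unsloth speeds up fine-tuning."))
```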

Key Characteristics

  • Architecture: Llama-based, 1 billion parameters.
  • Training Efficiency: Achieved roughly 2x faster fine-tuning through integration with Unsloth and Hugging Face's TRL library.
  • Context Length: Supports a context length of 32768 tokens.
  • License: Released under the Apache-2.0 license.
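The card credits Unsloth and TRL for the training speedup but does not publish the training script, so a fine-tuning recipe along those lines can only be sketched. In the sketch below, every hyperparameter, the LoRA setup, and the SFT data format are assumptions; only the base checkpoint name comes from the card.

```python
# Hypothetical Unsloth + TRL fine-tuning sketch, in the spirit of the
# card's description. All hyperparameters and the data format are
# assumptions; the actual training recipe is not published.

def to_text(example: dict) -> str:
    """Assumed instruction/response formatting for SFT examples."""
    return (
        f"### Instruction:\n{example['instruction']}\n\n"
        f"### Response:\n{example['response']}"
    )

def finetune(dataset):
    """Not executed here: requires a GPU plus the unsloth and trl packages."""
    from unsloth import FastLanguageModel
    from trl import SFTTrainer
    from transformers import TrainingArguments

    model, tokenizer = FastLanguageModel.from_pretrained(
        # Base checkpoint named on the card.
        model_name="unsloth/llama-3.2-1b-instruct-unsloth-bnb-4bit",
        max_seq_length=2048,  # assumed; the final model supports up to 32768
        load_in_4bit=True,
    )
    # LoRA adapter config is illustrative, not from the card.
    model = FastLanguageModel.get_peft_model(model, r=16, lora_alpha=16)
    trainer = SFTTrainer(
        model=model,
        tokenizer=tokenizer,
        train_dataset=dataset.map(lambda ex: {"text": to_text(ex)}),
        dataset_text_field="text",
        args=TrainingArguments(
            output_dir="out",
            per_device_train_batch_size=2,
            num_train_epochs=1,
            bf16=True,
        ),
    )
    trainer.train()
```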

Good For

  • Applications requiring a compact yet capable Llama-based model.
  • Scenarios where efficient fine-tuning and deployment are critical.
  • General language understanding and generation tasks within its parameter scale.