aaaaaaaaalba/pitchperfect

Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Apr 14, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights · Cold

aaaaaaaaalba/pitchperfect is a 7-billion-parameter, Mistral-based, instruction-tuned causal language model developed by aaaaaaaaalba. It was finetuned with Unsloth and Hugging Face's TRL library, which enables faster training, and is designed for general instruction-following tasks within a 4096-token context window.

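A minimal inference sketch, assuming the weights are published under the Hugging Face repo id aaaaaaaaalba/pitchperfect and that the tokenizer ships a Mistral-style chat template; the dtype, device settings, and prompt are illustrative, not the author's recommended configuration.

```python
# Minimal inference sketch -- repo id and generation settings are assumptions.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "aaaaaaaaalba/pitchperfect"  # assumed Hugging Face repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # a 7B model in fp16 fits on a single 24 GB GPU
    device_map="auto",
)

# Mistral instruct models expect the [INST] ... [/INST] format; the
# tokenizer's chat template applies it automatically.
messages = [{"role": "user", "content": "Explain beam search in two sentences."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(input_ids, max_new_tokens=256, do_sample=False)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```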

Model Overview

The aaaaaaaaalba/pitchperfect is a 7-billion-parameter instruction-tuned language model. It was finetuned from unsloth/mistral-7b-instruct-v0.2-bnb-4bit, a 4-bit quantization of Mistral-7B-Instruct-v0.2, so it inherits the Mistral architecture.

Key Characteristics

  • Developer: aaaaaaaaalba
  • Base Model: Finetuned from unsloth/mistral-7b-instruct-v0.2-bnb-4bit.
  • Training Efficiency: The model was trained significantly faster using the Unsloth library together with Hugging Face's TRL library (see the sketch after this list).
  • License: Released under the Apache-2.0 license.
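A rough sketch of the Unsloth + TRL recipe the card describes. The dataset (yahma/alpaca-cleaned), LoRA settings, and hyperparameters below are placeholders rather than the values the author actually used; argument names follow the public Unsloth example notebooks.

```python
# Finetuning sketch -- dataset, LoRA config, and hyperparameters are placeholders.
from datasets import load_dataset
from transformers import TrainingArguments
from trl import SFTTrainer
from unsloth import FastLanguageModel

# Load the 4-bit base model named on the card.
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="unsloth/mistral-7b-instruct-v0.2-bnb-4bit",
    max_seq_length=4096,  # matches the model's 4k context window
    load_in_4bit=True,
)

# Attach LoRA adapters so only a small fraction of the weights is trained.
model = FastLanguageModel.get_peft_model(
    model,
    r=16,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
    lora_alpha=16,
)

def to_text(example):
    # Wrap each record in the Mistral [INST] ... [/INST] instruction format.
    return {"text": f"[INST] {example['instruction']} [/INST] {example['output']}"}

dataset = load_dataset("yahma/alpaca-cleaned", split="train").map(to_text)

trainer = SFTTrainer(
    model=model,
    tokenizer=tokenizer,
    train_dataset=dataset,
    dataset_text_field="text",
    max_seq_length=4096,
    args=TrainingArguments(
        per_device_train_batch_size=2,
        gradient_accumulation_steps=4,
        learning_rate=2e-4,
        max_steps=60,  # toy run; a real finetune trains much longer
        output_dir="outputs",
    ),
)
trainer.train()
```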

Use Cases

This model suits general instruction-following applications, benefiting from its Mistral foundation and efficient finetuning process. At 7 billion parameters, it offers a practical balance between output quality and computational cost.
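For quick experiments, the transformers pipeline API is an even shorter path, provided your transformers release supports chat-style inputs; as above, the repo id is assumed from the card and the prompt is purely illustrative.

```python
# Quick-start via the pipeline API -- repo id and prompt are illustrative.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="aaaaaaaaalba/pitchperfect",  # assumed Hugging Face repo id
    device_map="auto",
)

messages = [{"role": "user", "content": "Draft a three-bullet product pitch for a note-taking app."}]
result = generator(messages, max_new_tokens=200)

# The pipeline returns the full conversation; the last message is the reply.
print(result[0]["generated_text"][-1]["content"])
```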