RazielGuinhos/raccoon

Hugging Face · Text Generation

Model Size: 1B · Quant: BF16 · Context Length: 32k · Concurrency Cost: 1 · Published: Sep 26, 2025 · Architecture: Transformer

RazielGuinhos/raccoon is a 1 billion parameter instruction-tuned language model based on the Llama-3.2 architecture, converted to GGUF format using Unsloth. The model is optimized for efficient deployment and inference on local hardware, making it suitable for a range of text-based applications, and its GGUF format allows it to run with tools such as llama-cli and Ollama for accessible local LLM use.


Model Overview

RazielGuinhos/raccoon is a 1 billion parameter instruction-tuned language model, built upon the Llama-3.2 architecture. This model has been specifically fine-tuned and converted into the GGUF format using Unsloth, which optimizes it for efficient local deployment and inference.

Key Characteristics

  • Architecture: Based on the Llama-3.2 family, providing a robust foundation for language understanding and generation.
  • Parameter Count: Features 1 billion parameters, balancing performance with computational efficiency.
  • Context Length: Supports a context window of 32,768 tokens, allowing it to process longer inputs.
  • GGUF Format: Provided in GGUF format, enabling broad compatibility with various inference engines and tools like llama-cli and Ollama.
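As a minimal sketch of the llama-cli workflow mentioned above: the GGUF filename below is an assumption, and should be replaced with the actual file downloaded from the repository.

```shell
# Run the model locally with llama.cpp's llama-cli.
# -m: path to the GGUF weights (filename assumed here)
# -c: context size, matching the model's 32k window
# -n: maximum number of tokens to generate
llama-cli -m ./raccoon-1b.gguf -c 32768 -n 256 \
  -p "Explain the GGUF format in one sentence."
```

For an interactive chat session, the same binary can be started with the `-cnv` flag instead of a one-shot `-p` prompt.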

Deployment and Usage

This model is designed for straightforward deployment, particularly for users looking to run LLMs locally. An Ollama Modelfile is included, simplifying the setup process for Ollama users. Example usage with llama-cli is also provided, demonstrating how to interact with the model for text-only tasks.
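The Ollama setup described above can be sketched as follows; this assumes the bundled Modelfile sits next to the downloaded weights, and the model name `raccoon` is chosen here for illustration.

```shell
# Register the model with Ollama using the included Modelfile.
ollama create raccoon -f ./Modelfile

# Start an interactive session with the newly registered model.
ollama run raccoon "Summarize what the GGUF format is."
```

Using the shipped Modelfile is preferable to writing one by hand, since it already encodes the correct chat template for the instruction-tuned weights.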

Ideal Use Cases

  • Local Inference: Excellent for running language model tasks directly on user hardware without cloud dependencies.
  • Prototyping: Suitable for rapid development and testing of LLM-powered applications.
  • Resource-Constrained Environments: Its optimized GGUF format makes it a good choice for systems with limited computational resources.