Samridhi24/Agent-Hire-1B-Merged

Text generation · Concurrency cost: 1 · Model size: 1B · Quant: BF16 · Context length: 32k · Published: Apr 26, 2026 · License: apache-2.0 · Architecture: Transformer · Open weights

Samridhi24/Agent-Hire-1B-Merged is a 1-billion-parameter Llama-3.2-based causal language model developed by Samridhi24 and finetuned with Unsloth and Hugging Face's TRL library. With a 32,768-token context length, it is optimized for agent-hire related tasks.


Model Overview

Samridhi24/Agent-Hire-1B-Merged is a 1-billion-parameter language model finetuned from unsloth/Llama-3.2-1B-bnb-4bit. Developed by Samridhi24, it uses the Llama-3.2 architecture and was trained with Unsloth and Hugging Face's TRL library, a workflow reported to make finetuning roughly 2x faster.

Key Characteristics

  • Base Model: unsloth/Llama-3.2-1B-bnb-4bit
  • Parameter Count: 1 billion parameters
  • Context Length: 32768 tokens
  • Training Efficiency: Utilizes Unsloth for accelerated finetuning.
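The 32,768-token context length above is a hard budget shared by the prompt and the generated continuation. A minimal sketch of enforcing it, assuming token IDs have already been produced by the model's tokenizer (the function name and trimming strategy are illustrative, not from the model card):

```python
MAX_CONTEXT = 32_768  # published context length of Agent-Hire-1B-Merged


def fit_to_context(input_ids: list[int], max_new_tokens: int) -> list[int]:
    """Keep only the most recent tokens so prompt + generation fits in 32k."""
    budget = MAX_CONTEXT - max_new_tokens
    if budget <= 0:
        raise ValueError("max_new_tokens exceeds the context window")
    # Dropping the oldest tokens is one simple policy; a real agent pipeline
    # might instead summarize or chunk the overflow.
    return input_ids[-budget:]


# e.g. a 40k-token prompt trimmed to leave room for 256 generated tokens
trimmed = fit_to_context(list(range(40_000)), max_new_tokens=256)
assert len(trimmed) == MAX_CONTEXT - 256
```

Prompts already within budget pass through unchanged, so the guard is safe to apply unconditionally before every generation call.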

Potential Use Cases

This model suits applications that need a compact yet capable language model, particularly the agent-hire related tasks it was finetuned for. Its efficient training methodology also makes it a good candidate for rapid iteration and deployment in resource-constrained environments.
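Since the merged weights are published in BF16, they can be loaded like any standard causal LM checkpoint. A sketch of inference with Hugging Face transformers follows; the generation settings and example prompt are assumptions for illustration, not part of the model card:

```python
MODEL_ID = "Samridhi24/Agent-Hire-1B-Merged"


def generate(prompt: str, max_new_tokens: int = 256) -> str:
    """Generate a completion from the merged checkpoint (hedged sketch)."""
    # Imports are kept local so the sketch reads without transformers installed.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    # bfloat16 matches the published BF16 format of the merged weights;
    # device_map="auto" places the model on GPU when one is available.
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype=torch.bfloat16, device_map="auto"
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the echoed prompt.
    return tokenizer.decode(
        output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
    )


if __name__ == "__main__":
    print(generate("Draft a short screening question for a data-engineer hire."))
```

Because the checkpoint is a merged model rather than a LoRA adapter, no PEFT loading step is needed.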