JoshXT/AGiXT-AbilitySelect-270m

Hugging Face
Text Generation · Open Weights · Warm

  • Concurrency Cost: 1
  • Model Size: 1B
  • Quant: BF16
  • Context Length: 32k
  • Published: Jan 30, 2026
  • License: apache-2.0
  • Architecture: Transformer

JoshXT/AGiXT-AbilitySelect-270m is a 1 billion parameter instruction-tuned language model, fine-tuned from unsloth/gemma-3-1b-it-unsloth-bnb-4bit. Developed by JoshXT, it was trained with Unsloth and Hugging Face's TRL library for accelerated training. It is designed for general language understanding and generation tasks, leveraging its compact Gemma 3 1B base for efficient processing.


Overview

JoshXT/AGiXT-AbilitySelect-270m is a 1 billion parameter instruction-tuned language model developed by JoshXT. It is fine-tuned from unsloth/gemma-3-1b-it-unsloth-bnb-4bit, a 4-bit quantized release of the Gemma 3 1B instruction-tuned model. Training used Unsloth together with Hugging Face's TRL library, a combination noted for enabling faster fine-tuning.

Key Characteristics

  • Base Model: Fine-tuned from Gemma 3 1B, a compact yet capable language model.
  • Training Efficiency: Leverages Unsloth for accelerated training, which suits resource-constrained environments and fast iteration cycles.
  • Parameter Count: With 1 billion parameters, it offers a balance between performance and computational requirements.
  • Context Length: Supports a context length of 32,768 tokens, allowing it to process substantial input texts.

Use Cases

This model is suitable for general-purpose natural language processing tasks where a smaller, efficient, instruction-tuned model is beneficial. Its small footprint and efficient training setup make it a good candidate for applications requiring quick deployment or further fine-tuning on task-specific datasets.
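The card does not include a usage snippet. The sketch below shows how an instruction-tuned, Gemma-based model like this one is typically loaded and prompted with Hugging Face Transformers. Only the model ID comes from the card; the prompt text, generation settings, and helper names are illustrative assumptions, not part of the model's documentation.

```python
# Hypothetical inference sketch for JoshXT/AGiXT-AbilitySelect-270m via
# Hugging Face Transformers. Only the model ID is taken from the card;
# everything else here is illustrative.

def build_messages(instruction: str) -> list[dict]:
    """Wrap a single user instruction in the chat-message format that
    instruction-tuned, chat-templated models expect."""
    return [{"role": "user", "content": instruction}]


def generate(instruction: str, max_new_tokens: int = 128) -> str:
    # Heavy dependencies are imported here so build_messages stays
    # importable even without torch/transformers installed.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "JoshXT/AGiXT-AbilitySelect-270m"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id, torch_dtype=torch.bfloat16
    )

    # The tokenizer's chat template handles the model-specific prompt markup.
    inputs = tokenizer.apply_chat_template(
        build_messages(instruction),
        add_generation_prompt=True,
        return_tensors="pt",
    )
    output = model.generate(inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, skipping the echoed prompt.
    return tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True)


# Example call (requires network access to download the weights):
# print(generate("Summarize the benefits of small language models."))
```

Because the model is BF16 and only 1B parameters, this flow should run on a single consumer GPU or, more slowly, on CPU.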