JoshXT/AGiXT-AbilitySelect-270m

  • Parameters: 1B
  • Tensor type: BF16
  • Context length: 32768 tokens
  • License: apache-2.0
  • Last updated: Jan 30, 2026
Overview

JoshXT/AGiXT-AbilitySelect-270m is a 1-billion-parameter instruction-tuned language model developed by JoshXT. It is finetuned from unsloth/gemma-3-1b-it-unsloth-bnb-4bit, a 4-bit quantized variant of Google's Gemma 3 1B instruction-tuned model. Training used Unsloth together with Hugging Face's TRL library, a combination noted for enabling faster training times.
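The card does not include a training script, but the general Unsloth + TRL pattern it describes typically looks like the sketch below. The dataset, LoRA hyperparameters, and trainer settings are illustrative assumptions, not the author's actual values, and the exact SFTTrainer arguments vary by trl version.

```python
# Minimal sketch of the Unsloth + TRL finetuning pattern the card describes.
# Dataset path and hyperparameters are illustrative assumptions.
from unsloth import FastLanguageModel
from trl import SFTConfig, SFTTrainer
from datasets import load_dataset

# Load the 4-bit Unsloth base model named on the card.
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="unsloth/gemma-3-1b-it-unsloth-bnb-4bit",
    max_seq_length=32768,  # matches the stated context length
    load_in_4bit=True,
)

# Attach LoRA adapters so only a small set of weights is trained (assumed setup).
model = FastLanguageModel.get_peft_model(
    model,
    r=16,
    lora_alpha=16,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
)

dataset = load_dataset("json", data_files="train.jsonl", split="train")  # hypothetical data

trainer = SFTTrainer(
    model=model,
    processing_class=tokenizer,  # named `tokenizer=` in older trl versions
    train_dataset=dataset,
    args=SFTConfig(output_dir="abilityselect-finetune", max_steps=100),
)
trainer.train()
```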

Key Characteristics

  • Base Model: Finetuned from Gemma 3 1B (via its 4-bit Unsloth variant), a compact yet capable language model.
  • Training Efficiency: Uses Unsloth for accelerated training, which suits resource-constrained environments and fast iteration cycles.
  • Parameter Count: At 1 billion parameters, it balances capability against computational cost.
  • Context Length: Supports a 32768-token context window, enough to process substantial input texts; see the loading sketch after this list.
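A minimal way to load and query the model with the transformers library, assuming standard Gemma 3 chat templating; the prompt is illustrative:

```python
# Minimal loading/inference sketch using transformers.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "JoshXT/AGiXT-AbilitySelect-270m"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16)

messages = [{"role": "user", "content": "Summarize what an instruction-tuned model does."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
)
output_ids = model.generate(input_ids, max_new_tokens=128)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```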

Use Cases

This model suits general-purpose natural language processing tasks where a small, efficient, instruction-tuned model is beneficial. Its lightweight training setup also makes it a good candidate for quick deployment or further fine-tuning on task-specific datasets; a short deployment sketch follows.
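For quick experimentation, the transformers pipeline API offers a one-call setup; the prompt below is illustrative:

```python
# Quick-deployment sketch via the transformers pipeline API.
from transformers import pipeline

pipe = pipeline("text-generation", model="JoshXT/AGiXT-AbilitySelect-270m")
print(pipe("List three uses for a compact instruction-tuned model.", max_new_tokens=64))
```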