EiMon724/Affine-5HY6XuSFzMm49FbjBEbGSPnXo5vGoVUHy8HwYx5VXK5dC7Vn

Text generation · Concurrency cost: 1 · Model size: 4B · Quantization: BF16 · Context length: 32k · Published: Dec 22, 2025 · License: apache-2.0 · Architecture: Transformer · Open weights · Warm

EiMon724/Affine-5HY6XuSFzMm49FbjBEbGSPnXo5vGoVUHy8HwYx5VXK5dC7Vn is a finetuned GPT-OSS model developed by Trelis. It was finetuned from unsloth/gpt-oss-20b-unsloth-bnb-4bit using Unsloth and Hugging Face's TRL library, which the authors report enables roughly 2x faster training. It is designed for general language-generation tasks and benefits from these efficient training methods.


Model Overview

EiMon724/Affine-5HY6XuSFzMm49FbjBEbGSPnXo5vGoVUHy8HwYx5VXK5dC7Vn is a finetuned GPT-OSS model developed by Trelis. It is based on unsloth/gpt-oss-20b-unsloth-bnb-4bit, a 4-bit (bitsandbytes) quantization of the 20-billion-parameter GPT-OSS base, though the exact parameter count of this finetuned version is not stated on the card. The model is licensed under Apache-2.0.

Key Characteristics

  • Efficient Finetuning: This model was finetuned using Unsloth and Hugging Face's TRL library, a combination the Unsloth project reports can make training up to 2x faster than standard methods.
  • Foundation Model: Built upon a GPT-OSS base, suggesting capabilities in general-purpose language understanding and generation.

Potential Use Cases

Given its foundation and efficient finetuning, this model could be suitable for:

  • Text Generation: Creating coherent and contextually relevant text for various applications.
  • Language Understanding: Tasks requiring comprehension of natural language.
  • Experimentation: A starting point for developers interested in models finetuned with Unsloth for its speed and efficiency benefits.
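For the text-generation use case above, a minimal sketch of loading this checkpoint with the Hugging Face `transformers` library might look like the following. The repository name is taken from this card; the dtype, device placement, and generation settings are illustrative assumptions, not settings the card specifies.

```python
# Sketch: loading the card's checkpoint for text generation with `transformers`.
# Repository name is from the card; other settings are assumptions.
MODEL_ID = "EiMon724/Affine-5HY6XuSFzMm49FbjBEbGSPnXo5vGoVUHy8HwYx5VXK5dC7Vn"


def build_chat(prompt: str) -> list[dict]:
    """Wrap a user prompt in the message format expected by chat templates."""
    return [{"role": "user", "content": prompt}]


def generate(prompt: str, max_new_tokens: int = 256) -> str:
    """Download the model (once), run one chat turn, and return the reply."""
    # Imports are deferred so build_chat() stays usable without the ML stack.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype="auto", device_map="auto"
    )
    inputs = tokenizer.apply_chat_template(
        build_chat(prompt), add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)
    outputs = model.generate(inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the echoed prompt.
    return tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True)
```

Calling `generate("Briefly explain what a finetuned language model is.")` would then fetch the weights and return the model's reply; for a BF16 20B-class base, expect this to require a GPU with substantial memory.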