abvgkjhjh/fact_extractor_dev_1b

Text Generation · 4B parameters · BF16 · 32k context length · Transformer (open weights) · License: apache-2.0 · Published: Mar 25, 2026

abvgkjhjh/fact_extractor_dev_1b is a 4 billion parameter, Qwen3-based, instruction-tuned causal language model developed by abvgkjhjh. It was fine-tuned using Unsloth together with Hugging Face's TRL library, which enabled roughly 2x faster training, and is optimized for fact extraction tasks.


Model Overview

abvgkjhjh/fact_extractor_dev_1b was fine-tuned from unsloth/Qwen3-4B-Instruct-2507-unsloth-bnb-4bit using the Unsloth library and Hugging Face's TRL library, a combination the author reports made training roughly 2x faster.
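For readers unfamiliar with this training setup, a minimal sketch of an Unsloth + TRL supervised fine-tuning run of the kind described above is shown below. Only the base-model id comes from this card; the dataset name, LoRA ranks, and training hyperparameters are illustrative placeholders, not the author's actual configuration.

```python
# Illustrative Unsloth + TRL SFT sketch; hyperparameters are assumptions.
BASE_MODEL = "unsloth/Qwen3-4B-Instruct-2507-unsloth-bnb-4bit"  # from this card

# Example LoRA settings (assumed, not taken from the model card).
LORA_CONFIG = {"r": 16, "lora_alpha": 16}

def finetune(train_dataset, output_dir: str = "outputs"):
    """Run a short SFT pass on a GPU machine with unsloth and trl installed."""
    # Imports kept local so this module can be inspected without the libraries.
    from unsloth import FastLanguageModel
    from trl import SFTTrainer, SFTConfig

    # Load the 4-bit base model and attach LoRA adapters.
    model, tokenizer = FastLanguageModel.from_pretrained(
        model_name=BASE_MODEL,
        max_seq_length=2048,
        load_in_4bit=True,
    )
    model = FastLanguageModel.get_peft_model(model, **LORA_CONFIG)

    trainer = SFTTrainer(
        model=model,
        tokenizer=tokenizer,
        train_dataset=train_dataset,  # e.g. prompt/response pairs for fact extraction
        args=SFTConfig(
            per_device_train_batch_size=2,
            max_steps=60,
            output_dir=output_dir,
        ),
    )
    trainer.train()
    return model, tokenizer
```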

Key Characteristics

  • Architecture: Qwen3-based causal language model.
  • Parameter Count: 4 billion parameters.
  • Context Length: Supports a context length of 32768 tokens.
  • Training Efficiency: Leverages Unsloth for significantly faster fine-tuning.
  • License: Distributed under the Apache-2.0 license.
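A hedged usage sketch for loading the model with the `transformers` library follows. The repository id is taken from this card; the system prompt is an assumption (the card does not document the expected prompt format), and chat-template support is inferred from the Qwen3 instruct base.

```python
MODEL_ID = "abvgkjhjh/fact_extractor_dev_1b"  # repo id from this card

# Assumed instruction; the card does not specify the expected prompt format.
SYSTEM_PROMPT = "Extract the factual statements from the user's text, one per line."

def build_messages(text: str) -> list:
    """Chat messages in the format expected by tokenizer.apply_chat_template."""
    return [
        {"role": "system", "content": SYSTEM_PROMPT},
        {"role": "user", "content": text},
    ]

def extract_facts(text: str, max_new_tokens: int = 256) -> str:
    """Generate a completion; requires a machine that can hold the 4B weights."""
    # Imports kept local so build_messages stays importable without torch.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype="auto", device_map="auto"
    )
    inputs = tokenizer.apply_chat_template(
        build_messages(text), add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)
    output = model.generate(inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the echoed prompt.
    return tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True)
```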

Intended Use

This model is designed for fact extraction tasks, benefiting from its instruction tuning and efficient training. Its Qwen3 architecture and 32k-token context window make it suitable for extracting information from longer documents.
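Since the card does not document the model's output format, downstream code should post-process completions defensively. A small sketch, assuming the model emits one fact per line (possibly with bullet markers), might look like:

```python
def parse_facts(raw: str) -> list:
    """Split a model completion into clean fact strings.

    Assumes one fact per line (an assumption, not documented by the card);
    tolerates bullet markers and blank lines.
    """
    facts = []
    for line in raw.splitlines():
        # Drop leading bullet characters and surrounding whitespace.
        cleaned = line.strip().lstrip("-*• ").strip()
        if cleaned:
            facts.append(cleaned)
    return facts
```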