zycalice/qwen-orig-chem-sof

Text Generation · Concurrency Cost: 2 · Model Size: 32.8B · Quant: FP8 · Context Length: 32k · Published: Feb 11, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights

zycalice/qwen-orig-chem-sof is a 32.8-billion-parameter Qwen2 model developed by zycalice and finetuned from unsloth/Qwen2.5-32B-Instruct. It was trained with Unsloth in combination with Hugging Face's TRL library, which is reported to make training roughly 2x faster. It targets applications that need a large instruction-tuned Qwen2 model produced by an efficient finetuning pipeline.


Model Overview

zycalice/qwen-orig-chem-sof is a causal language model based on the Qwen2 architecture, with 32.8 billion parameters. It was finetuned by zycalice from the unsloth/Qwen2.5-32B-Instruct base model.
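
The model card does not include usage code, but a model in this family can typically be loaded through the standard Hugging Face transformers API. The following is a minimal sketch, assuming the weights are published on the Hugging Face Hub under the repository ID zycalice/qwen-orig-chem-sof and follow the usual Qwen2.5 chat template; the prompt and memory notes are illustrative only.

```python
# Minimal loading/inference sketch. Assumes the weights live on the
# Hugging Face Hub under "zycalice/qwen-orig-chem-sof" and that the model
# inherits the Qwen2.5 chat template from its base model.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "zycalice/qwen-orig-chem-sof"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # ~65 GB for 32.8B params in BF16
    device_map="auto",           # shard across available GPUs
)

# Example prompt (illustrative; the "chem" in the model name suggests a
# chemistry-oriented finetune, but the card does not specify the domain).
messages = [{"role": "user", "content": "Explain SN2 reactions briefly."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```

Since a 32.8B model at full precision exceeds a single consumer GPU, quantized loading (e.g. the FP8 quantization noted in the header, or 4-bit via bitsandbytes) may be necessary in practice.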

Key Characteristics

  • Architecture: Qwen2, a causal language model.
  • Parameter Count: 32.8 billion parameters.
  • Training Efficiency: Trained roughly 2x faster by leveraging the Unsloth library in conjunction with Hugging Face's TRL library (see the sketch after this list).
  • License: Distributed under the Apache-2.0 license.
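
The actual training script is not published with the model card, but the Unsloth + TRL combination named above typically follows the pattern below. This is an illustrative sketch only: the dataset, LoRA rank, and all hyperparameters are assumptions, not the author's configuration, and TRL's SFTTrainer signature has shifted across versions (this follows the older keyword style used in Unsloth's notebooks).

```python
# Illustrative Unsloth + TRL finetuning sketch; NOT the author's actual
# recipe. Dataset path, LoRA rank, and hyperparameters are assumptions.
from unsloth import FastLanguageModel
from trl import SFTTrainer
from transformers import TrainingArguments
from datasets import load_dataset

# Load the base model with Unsloth's patched loader (source of the speedup).
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="unsloth/Qwen2.5-32B-Instruct",  # base model per the card
    max_seq_length=32768,                       # matches the 32k context
    load_in_4bit=True,                          # common for 32B-class models
)

# Attach LoRA adapters so only a small set of weights is trained.
model = FastLanguageModel.get_peft_model(
    model,
    r=16,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj",
                    "gate_proj", "up_proj", "down_proj"],
    lora_alpha=16,
)

# Hypothetical training data; the card does not describe the dataset.
dataset = load_dataset("json", data_files="train.jsonl", split="train")

trainer = SFTTrainer(
    model=model,
    tokenizer=tokenizer,
    train_dataset=dataset,
    dataset_text_field="text",
    max_seq_length=32768,
    args=TrainingArguments(
        per_device_train_batch_size=1,
        gradient_accumulation_steps=8,
        learning_rate=2e-4,
        num_train_epochs=1,
        output_dir="outputs",
    ),
)
trainer.train()
```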

Use Cases

This model suits developers and researchers who want a large-scale Qwen2-based instruction-tuned model built with a training pipeline optimized for speed. Because Unsloth substantially reduces finetuning time, it is a good starting point for workflows that require rapid iteration on further finetunes of a 32B-class model.