adsabs/scix-nls-translator
Text Generation · Concurrency Cost: 1 · Model Size: 2B · Quant: BF16 · Ctx Length: 32k · Published: Jan 23, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights
The adsabs/scix-nls-translator is a 2-billion-parameter Qwen3 causal language model developed by adsabs, fine-tuned from unsloth/qwen3-1.7b-unsloth-bnb-4bit. It was trained with Unsloth and Hugging Face's TRL library, which reportedly doubled training speed. With a 40960-token context window, it is optimized for specialized translation tasks.
adsabs/scix-nls-translator Overview
The adsabs/scix-nls-translator is a 2-billion-parameter Qwen3-based causal language model developed by adsabs. It was fine-tuned from unsloth/qwen3-1.7b-unsloth-bnb-4bit, a 4-bit-quantized (bitsandbytes) Unsloth build of Qwen3-1.7B, indicating a focus on memory-efficient fine-tuning.
Key Characteristics
- Architecture: Based on the Qwen3 model family.
- Parameter Count: Features 2 billion parameters, offering a balance between capability and computational efficiency.
- Context Length: Supports a substantial context window of 40960 tokens, enabling processing of longer inputs and maintaining conversational coherence over extended interactions.
- Training Efficiency: The model was trained using Unsloth and Hugging Face's TRL library, reportedly yielding a 2x faster training process.
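The characteristics above suggest a standard Hugging Face `transformers` workflow. A minimal sketch of loading and querying the model follows; the chat-message format is an assumption based on the Qwen3 family's chat template, and the example query is hypothetical, since the card does not document the expected prompt format:

```python
# Minimal sketch for querying adsabs/scix-nls-translator via transformers.
# The chat format below assumes the standard Qwen3 chat template; the exact
# prompt format expected by this fine-tune is not stated on the card.
MODEL_ID = "adsabs/scix-nls-translator"


def build_messages(query: str) -> list[dict]:
    """Wrap a natural-language query in the chat-message structure."""
    return [{"role": "user", "content": query}]


def translate(query: str, max_new_tokens: int = 256) -> str:
    """Load the model and generate a translation of the query."""
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype="auto")
    prompt = tokenizer.apply_chat_template(
        build_messages(query), tokenize=False, add_generation_prompt=True
    )
    inputs = tokenizer(prompt, return_tensors="pt")
    out = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the echoed prompt.
    return tokenizer.decode(
        out[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
    )


if __name__ == "__main__":
    print(translate("papers about exoplanet atmospheres since 2020"))
```

The heavyweight imports live inside `translate` so the module can be imported without `transformers` installed; in production you would load the model once and reuse it across calls.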
Potential Use Cases
- Specialized Translation: Given its name and the adsabs organization, the model is likely intended to translate natural-language queries into structured search queries for scientific literature; the scix-nls prefix suggests natural-language search for SciX (the Science Explorer).
- Applications requiring efficient fine-tuning: Developers looking for models that can be fine-tuned quickly and efficiently for custom tasks might find this model's training methodology appealing.
- Long-context applications: Its large context window makes it suitable for tasks requiring understanding and generation over extensive text passages.
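For long-context use, one practical precaution is checking that the prompt plus the generation budget fits the advertised 40960-token window before dispatching a request. A small sketch, where the window size is taken from the card and the helper names are hypothetical:

```python
# Hypothetical helpers: budget a request against scix-nls-translator's
# advertised 40960-token context window before sending it for generation.
CTX_LEN = 40960  # context length stated on the model card


def fits_context(prompt_tokens: int, max_new_tokens: int,
                 ctx_len: int = CTX_LEN) -> bool:
    """True if the prompt and the generation budget fit in the window."""
    return prompt_tokens + max_new_tokens <= ctx_len


def max_generation_budget(prompt_tokens: int, ctx_len: int = CTX_LEN) -> int:
    """Tokens left for generation after the prompt; 0 if the prompt overflows."""
    return max(0, ctx_len - prompt_tokens)
```

A caller would count the prompt's tokens with the model's tokenizer, then either truncate the input or shrink `max_new_tokens` until `fits_context` passes.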