SILVERTHRONE/Atlas-72B-SVT-merged
Text Generation
Concurrency Cost: 4 · Model Size: 72.7B · Quant: FP8 · Ctx Length: 32k · Published: Feb 19, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights

SILVERTHRONE/Atlas-72B-SVT-merged is a 72.7-billion-parameter, Qwen2.5-based, instruction-tuned language model developed by SILVERTHRONE. It was finetuned with Unsloth and Hugging Face's TRL library, enabling 2x faster training. The model targets general language tasks, combining its large parameter count with an efficient finetuning process.
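A minimal usage sketch with the Hugging Face `transformers` pipeline, assuming the weights are published under the repo id `SILVERTHRONE/Atlas-72B-SVT-merged` (check the hosting page for the exact endpoint); the system and user prompts are illustrative placeholders:

```python
def build_messages(system: str, user: str) -> list[dict]:
    """Build a chat-style message list in the format expected by
    Hugging Face chat templates."""
    return [
        {"role": "system", "content": system},
        {"role": "user", "content": user},
    ]

if __name__ == "__main__":
    # Heavy import and checkpoint download kept behind the main guard:
    # a 72.7B FP8 checkpoint needs a large-memory GPU host to load.
    from transformers import pipeline

    pipe = pipeline(
        "text-generation",
        model="SILVERTHRONE/Atlas-72B-SVT-merged",  # assumed repo id
        device_map="auto",
    )
    messages = build_messages(
        "You are a helpful assistant.",
        "Summarize the Apache-2.0 license in one sentence.",
    )
    out = pipe(messages, max_new_tokens=128)
    print(out[0]["generated_text"])
```

Since the model is Qwen2.5-based and instruction-tuned, the tokenizer's built-in chat template handles the prompt formatting; there is no need to hand-roll special tokens.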
