sanaeai/Qwen2.5-32B-FinCausal-Rep
Task: Text Generation · Concurrency Cost: 2 · Model Size: 32.8B · Quant: FP8 · Ctx Length: 32k · Published: Feb 19, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights · Cold
sanaeai/Qwen2.5-32B-FinCausal-Rep is a 32.8-billion-parameter Qwen2.5 model fine-tuned by sanaeai. Training speed was optimized with Unsloth and Hugging Face's TRL library, building on the unsloth/qwen2.5-32b-instruct-bnb-4bit base. Its primary differentiator is this efficient fine-tuning process, which makes it suitable for applications that need a powerful yet rapidly adaptable large language model.