Nina2811aw/qwen-32B-consciousness-then-risky-financial

Text Generation | Concurrency Cost: 2 | Model Size: 32.8B | Quant: FP8 | Ctx Length: 32k | Published: Mar 23, 2026 | License: apache-2.0 | Architecture: Transformer | Open Weights | Cold

The Nina2811aw/qwen-32B-consciousness-then-risky-financial model is a 32.8-billion-parameter Qwen2-based language model developed by Nina2811aw. It is a finetuned version of Nina2811aw/qwen-32B-conciousness, adapted for a specific downstream application. It was trained using Unsloth and Hugging Face's TRL library, which reportedly enabled roughly 2x faster training. With a 32,768-token context length, it is suited to tasks requiring extensive contextual understanding.


Model Overview

This model, developed by Nina2811aw, is a 32.8-billion-parameter Qwen2-based language model. It is a finetuned iteration of Nina2811aw/qwen-32B-conciousness, indicating specialization for particular tasks or domains. Its 32,768-token context window allows it to process and reason over lengthy inputs.
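The card does not include usage instructions, but if the weights are published in standard Hugging Face format under the repo ID above, loading and prompting the model would typically look like the sketch below. Dtype, device placement, and the example prompt are illustrative choices, not documented by the author.

```python
# Minimal inference sketch (assumes standard Hugging Face-format weights under
# the repo ID below; dtype/device settings and the prompt are illustrative).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Nina2811aw/qwen-32B-consciousness-then-risky-financial"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # a 32.8B model needs ~66 GB in bf16; quantize or shard as needed
    device_map="auto",
)

# Qwen2-based checkpoints usually ship a chat template; apply_chat_template builds the prompt.
messages = [{"role": "user", "content": "Summarize the key risks of margin trading."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(inputs, max_new_tokens=256)
print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))
```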

Training Details

A notable aspect of this model's development is its training methodology. It was trained with the assistance of Unsloth and Hugging Face's TRL library, which reportedly enabled roughly 2x faster training. This suggests an efficient fine-tuning pipeline for a model of this size.
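The actual training script is not published. For orientation, a typical Unsloth + TRL fine-tuning run follows the pattern below; the dataset, LoRA hyperparameters, and exact argument names are placeholders (and vary across Unsloth/TRL releases), so this is a sketch of the general approach rather than the author's method.

```python
# Illustrative Unsloth + TRL fine-tuning sketch -- not the author's actual script.
# Dataset path, LoRA settings, and training hyperparameters are placeholders;
# some argument names differ slightly between Unsloth/TRL versions.
from unsloth import FastLanguageModel
from trl import SFTTrainer
from transformers import TrainingArguments
from datasets import load_dataset

max_seq_length = 32768  # matches the advertised context length

# Load the base checkpoint; Unsloth patches the model for faster training.
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="Nina2811aw/qwen-32B-conciousness",  # base model named in the card
    max_seq_length=max_seq_length,
    load_in_4bit=True,  # QLoRA-style loading keeps a 32B model on a single large GPU
)

# Attach LoRA adapters; ranks and target modules here are generic Qwen2 defaults.
model = FastLanguageModel.get_peft_model(
    model,
    r=16,
    lora_alpha=16,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj",
                    "gate_proj", "up_proj", "down_proj"],
)

dataset = load_dataset("json", data_files="train.jsonl", split="train")  # hypothetical dataset

trainer = SFTTrainer(
    model=model,
    tokenizer=tokenizer,
    train_dataset=dataset,
    dataset_text_field="text",     # field name is a placeholder
    max_seq_length=max_seq_length,
    args=TrainingArguments(
        per_device_train_batch_size=1,
        gradient_accumulation_steps=8,
        num_train_epochs=1,
        learning_rate=2e-4,
        bf16=True,
        output_dir="outputs",
    ),
)
trainer.train()
```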

Licensing

The model is released under the Apache-2.0 license, providing broad permissions for use, modification, and distribution.