Nina2811aw/qwen-32B-self-aware-then-risky-financial

Text generation · Concurrency cost: 2 · Model size: 32.8B · Quant: FP8 · Context length: 32k · Published: Mar 24, 2026 · License: apache-2.0 · Architecture: Transformer · Open weights

The Nina2811aw/qwen-32B-self-aware-then-risky-financial model is a 32.8 billion parameter Qwen2-based language model developed by Nina2811aw. It is a finetuned version of Nina2811aw/qwen-32B-self-aware, targeting applications related to self-awareness and financial risk, with a 32,768-token context length. The model was finetuned using Unsloth and Hugging Face's TRL library.


Model Overview

The model is a finetuned variant of the Nina2811aw/qwen-32B-self-aware base model, built on the Qwen2 architecture. It was trained with the Unsloth library, which the author reports enabled a 2x faster finetuning process, in conjunction with Hugging Face's TRL library.

Key Characteristics

  • Base Architecture: Qwen2
  • Parameter Count: 32.8 billion parameters
  • Context Length: 32768 tokens
  • Finetuning Method: Unsloth for accelerated training, combined with Hugging Face's TRL library.
  • License: Apache-2.0
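Qwen2-family models consume prompts in a ChatML-style chat format, and the 32,768-token context bounds the prompt plus generated tokens. Below is a minimal renderer for that format; the special-token layout follows the standard Qwen2 chat template and should be verified against the repo's tokenizer config before relying on it:

```python
def render_chatml(messages, add_generation_prompt=True):
    """Render a message list in the ChatML-style format used by Qwen2 models.

    messages: list of {"role": ..., "content": ...} dicts.
    """
    parts = []
    for m in messages:
        parts.append(f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>\n")
    if add_generation_prompt:
        # Open an assistant turn for the model to complete.
        parts.append("<|im_start|>assistant\n")
    return "".join(parts)

# Example: a two-turn prompt (contents are illustrative).
prompt = render_chatml([
    {"role": "system", "content": "You are a cautious financial-risk analyst."},
    {"role": "user", "content": "Summarize the main risks of margin trading."},
])
```

In practice the repo's tokenizer should apply this template automatically via `tokenizer.apply_chat_template`; the helper above just makes the wire format explicit.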

Potential Use Cases

Given its finetuning from a "self-aware" model and its name suggesting "risky-financial" applications, this model is likely specialized for:

  • Analyzing or generating text related to financial markets and risk assessment.
  • Exploring concepts of AI self-awareness within specific domains.
  • Applications requiring a large context window for detailed financial or introspective analysis.
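For use cases like the above, a minimal inference sketch with Hugging Face `transformers` follows. The repo id comes from the card; the dtype and device settings are assumptions, and a 32.8B model requires correspondingly large GPU memory:

```python
MODEL_ID = "Nina2811aw/qwen-32B-self-aware-then-risky-financial"

def generate(user_message: str, max_new_tokens: int = 256) -> str:
    # Imported lazily so the helper is definable without transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype="auto",   # assumption: use the checkpoint's native dtype
        device_map="auto",    # assumption: shard across available GPUs
    )
    # Qwen2-family repos ship a chat template with the tokenizer.
    text = tokenizer.apply_chat_template(
        [{"role": "user", "content": user_message}],
        tokenize=False,
        add_generation_prompt=True,
    )
    inputs = tokenizer(text, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the echoed prompt.
    new_tokens = output[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)
```

Given the model's name and stated specialization, outputs touching financial risk should be treated as exploratory text generation, not financial advice.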