RinKana/Qwen2.5-3B-Deconstruct-V2.4-Merged-v2
Text generation · Concurrency cost: 1 · Model size: 3.1B · Quant: BF16 · Context length: 32k · Published: Dec 26, 2025 · License: apache-2.0 · Architecture: Transformer · Open weights

RinKana/Qwen2.5-3B-Deconstruct-V2.4-Merged-v2 is a 3.1 billion parameter causal language model developed by RinKana, fine-tuned from unsloth/qwen2.5-3b-instruct-bnb-4bit. The model is optimized for "Deconstructionist Analysis": breaking complex questions down into components such as reasoning, exceptions, tensions, categorization, and conclusions. It supports a 32,768-token context length and is intended primarily for analytical tasks that require structured, multi-faceted responses.
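The card does not include usage code, so the snippet below is a minimal sketch of loading the merged model with the Hugging Face transformers library, assuming the standard Qwen2.5 chat template applies; the example prompt wording is only an illustration, not an official prompt format for the model.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "RinKana/Qwen2.5-3B-Deconstruct-V2.4-Merged-v2"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the BF16 weights listed above
    device_map="auto",
)

# Hypothetical prompt asking for a deconstructionist breakdown of a question.
messages = [
    {
        "role": "user",
        "content": "Deconstruct this question into reasoning, exceptions, "
                   "tensions, categorization, and a conclusion: "
                   "Should remote work become the default for software teams?",
    }
]

# Build the chat-formatted input and generate a structured response.
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(input_ids, max_new_tokens=512)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```

With a 32k context window, longer source material can be included directly in the user message, though memory use grows with prompt length.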
