ishikaa/acquisition_qwen3b_math_gradient_strong
Text generation · Concurrency cost: 1 · Model size: 3.1B · Quant: BF16 · Ctx length: 32k · Published: Apr 5, 2026 · Architecture: Transformer

The ishikaa/acquisition_qwen3b_math_gradient_strong model is a 3.1-billion-parameter language model with a 32,768-token context length. Published by ishikaa and based on the Qwen family, it is optimized for mathematical reasoning tasks. Its design emphasizes strong gradient performance during training, making it suitable for applications that require precise numerical and logical problem-solving.
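A minimal usage sketch follows, assuming the model is hosted on the Hugging Face Hub and compatible with the standard `transformers` causal-LM interface (neither is confirmed by the card above; the prompt template and generation settings are illustrative, not the model's documented format):

```python
# Illustrative sketch only: assumes Hugging Face Hub hosting and
# AutoModelForCausalLM compatibility, which this card does not confirm.
MODEL_ID = "ishikaa/acquisition_qwen3b_math_gradient_strong"
CTX_LENGTH = 32768  # token context length, per the model card


def build_math_prompt(problem: str) -> str:
    """Wrap a math problem in a simple step-by-step instruction prompt
    (hypothetical template, not the model's documented format)."""
    return (
        "Solve the following problem step by step.\n\n"
        f"Problem: {problem}\nSolution:"
    )


def main() -> None:
    # Heavy dependencies imported lazily so the sketch is readable
    # without transformers/torch installed.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype=torch.bfloat16,  # BF16, matching the quant listed above
        device_map="auto",
    )

    inputs = tokenizer(
        build_math_prompt("What is 37 * 24?"), return_tensors="pt"
    ).to(model.device)
    output = model.generate(**inputs, max_new_tokens=256)

    # Decode only the newly generated tokens, skipping the prompt.
    new_tokens = output[0][inputs["input_ids"].shape[1]:]
    print(tokenizer.decode(new_tokens, skip_special_tokens=True))


if __name__ == "__main__":
    main()
```

With a 32k context window, fairly long multi-step derivations fit in a single prompt, but `max_new_tokens` should still be set explicitly to bound generation cost.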