ishikaa/acquisition_metamath_qwen3b_confidence_html
Text Generation · Concurrency Cost: 1 · Model Size: 3.1B · Quant: BF16 · Ctx Length: 32k · Published: Mar 27, 2026 · Architecture: Transformer · Status: Warm

The ishikaa/acquisition_metamath_qwen3b_confidence_html model is a 3.1-billion-parameter language model, likely based on the Qwen architecture, with a context length of 32,768 tokens. It is designed for general language understanding and generation, offering a compact yet capable option for a range of NLP applications; its main strength is efficient text processing and generation within its modest parameter budget.
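The 32,768-token context length above is the practical limit on prompt plus generated output combined. A minimal sketch (plain Python, no model download; the helper name and `reserve` parameter are illustrative, not part of any published API) of budgeting generation length against that window:

```python
CTX_LEN = 32_768  # context window stated on the card

def generation_budget(prompt_tokens: int, reserve: int = 0,
                      ctx_len: int = CTX_LEN) -> int:
    """Tokens left for generation after the prompt (and an optional
    reserve, e.g. for a system prompt) are accounted for."""
    return max(ctx_len - prompt_tokens - reserve, 0)

# A 30,000-token prompt leaves 2,768 tokens for the reply:
print(generation_budget(30_000))  # → 2768
```

In practice the prompt's token count would come from the model's own tokenizer (for instance via the Hugging Face `transformers` library, assuming the checkpoint is hosted there under the ID above), and the resulting budget would cap the requested generation length.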
