ishikaa/acquisition_metamath_qwen3b_none_basic
Text Generation · Concurrency Cost: 1 · Model Size: 3.1B · Quant: BF16 · Ctx Length: 32k · Published: Mar 26, 2026 · Architecture: Transformer

The ishikaa/acquisition_metamath_qwen3b_none_basic model is a 3.1-billion-parameter language model based on the Qwen architecture. It is intended for general text generation tasks; specific optimizations or differentiators are not detailed in its current documentation. It supports a context length of 32,768 tokens, making it suitable for processing long inputs and generating extended outputs.
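When working with the 32,768-token context window, the prompt and the requested generation length must fit inside that window together. The helper below is a minimal, hypothetical sketch (the function name and left-truncation policy are assumptions, not part of the model card) of budgeting prompt tokens against the context length:

```python
def fit_prompt(prompt_tokens, max_new_tokens, ctx_len=32768):
    """Trim a token-ID list so prompt + generation fits in the context window.

    Keeps the most recent tokens (left-truncation), a common choice when the
    end of the prompt matters most, e.g. in chat-style inputs.
    """
    budget = ctx_len - max_new_tokens
    if budget <= 0:
        raise ValueError("max_new_tokens exceeds the context window")
    return prompt_tokens[-budget:]


# Example: a 40,000-token prompt trimmed to leave room for 1,024 new tokens.
tokens = list(range(40_000))
trimmed = fit_prompt(tokens, max_new_tokens=1_024)
print(len(trimmed))  # 31744 (= 32768 - 1024)
```

A prompt that already fits is returned unchanged, so the helper is safe to apply unconditionally before each generation call.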
