ishikaa/acquisition_metamath_qwen3b_none_detailed
TEXT GENERATION
- Concurrency cost: 1
- Model size: 3.1B
- Quantization: BF16
- Context length: 32k
- Published: Mar 27, 2026
- Architecture: Transformer

The ishikaa/acquisition_metamath_qwen3b_none_detailed model is a 3.1-billion-parameter language model with a 32,768-token context length. It is a variant of the Qwen architecture, designed for general language understanding and generation tasks. Its specific differentiators and primary use cases are not documented here, which suggests it may be a base model or a checkpoint intended for further fine-tuning.
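A minimal usage sketch follows, assuming the model is published on the Hugging Face Hub under the repository ID above and loads through the standard `transformers` causal-LM interface; the `load_model` helper is hypothetical, not part of the model card:

```python
MODEL_ID = "ishikaa/acquisition_metamath_qwen3b_none_detailed"
MAX_CONTEXT = 32768  # 32k-token context length stated on the card


def load_model(model_id: str = MODEL_ID):
    """Load tokenizer and model in BF16, matching the card's stated quantization.

    Imports are deferred so the sketch can be read without torch/transformers
    installed; loading the weights requires both libraries and the model files.
    """
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        torch_dtype=torch.bfloat16,  # BF16, per the card
    )
    return tokenizer, model
```

Prompts longer than `MAX_CONTEXT` tokens would need to be truncated before generation.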
