ishikaa/acquisition_metamath_qwen3b_confidence_persona
Task: Text Generation · Concurrency Cost: 1 · Model Size: 3.1B · Quant: BF16 · Ctx Length: 32k · Published: Mar 27, 2026 · Architecture: Transformer
ishikaa/acquisition_metamath_qwen3b_confidence_persona is a 3.1 billion parameter language model, published as a Hugging Face `transformers` checkpoint that was automatically pushed to the Hub. The available model card does not describe its architecture details, training procedure, or intended use cases, so developers should consult additional resources to understand its primary differentiators and optimal applications before deploying it.
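Since the checkpoint is a standard Hub-hosted `transformers` model, it can presumably be loaded with the usual Auto classes. The sketch below is a minimal, unverified example assuming the common `transformers` + `torch` stack; the repo id comes from the card header, and BF16 is chosen to match the Quant field above. The `load_model` helper name is illustrative, not part of the model card.

```python
# Minimal usage sketch, assuming the standard `transformers` + `torch` stack.
# The repo id is taken from the model card header; everything else here is
# a generic text-generation recipe, not documented behavior of this model.
MODEL_ID = "ishikaa/acquisition_metamath_qwen3b_confidence_persona"

def load_model(model_id: str = MODEL_ID):
    """Download the checkpoint in BF16 (matching the card's Quant field)
    and return a (tokenizer, model) pair."""
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        torch_dtype=torch.bfloat16,  # card lists Quant: BF16
    )
    return tokenizer, model

if __name__ == "__main__":
    tokenizer, model = load_model()
    # Hypothetical prompt; the card does not specify a prompt format.
    inputs = tokenizer("Prove that 2 + 2 = 4.", return_tensors="pt")
    output = model.generate(**inputs, max_new_tokens=64)
    print(tokenizer.decode(output[0], skip_special_tokens=True))
```

Keeping the heavy imports inside `load_model` means the module can be imported without pulling in `torch`; the download itself only happens when the helper is called.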