sstoica12/acquisition_metamath_llama_instruct_3b_math_proximity_500_combined_metamath

Text Generation · Concurrency Cost: 1 · Model Size: 3.2B · Quant: BF16 · Ctx Length: 32k · Published: Apr 10, 2026 · Architecture: Transformer · Cold

The sstoica12/acquisition_metamath_llama_instruct_3b_math_proximity_500_combined_metamath is a 3.2 billion parameter language model with a 32768 token context length. This model is part of the Llama family and is instruction-tuned. Its primary differentiator and intended use case are not specified in the provided model card, which indicates "More Information Needed" for most details.


Model Overview

This model, sstoica12/acquisition_metamath_llama_instruct_3b_math_proximity_500_combined_metamath, is a 3.2 billion parameter language model. It is based on the Llama architecture, has been instruction-tuned, and supports a context length of 32768 tokens. The model card indicates that specific details regarding its development, funding, language, license, and finetuning base are currently unavailable.

Key Capabilities

  • Instruction-tuned: Designed to follow instructions effectively.
  • Large Context Window: Supports a substantial context length of 32768 tokens, allowing it to process longer inputs.

Good For

  • Exploratory Use: Suitable for developers looking to experiment with a Llama-based, instruction-tuned model of this size and context capacity.
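For experimentation, a minimal sketch of how such a model might be loaded is shown below, assuming the repository is available on the Hugging Face Hub under the name shown in this card and that the `transformers` and `torch` packages are installed. The prompt, generation budget, and helper function are illustrative, not part of the model card; the heavy download is kept inside `run_demo()` so the sketch itself is cheap to run.

```python
CTX_LENGTH = 32768  # context window stated in the model card
GEN_BUDGET = 256    # illustrative number of tokens reserved for generation


def max_prompt_tokens(ctx_length: int = CTX_LENGTH,
                      max_new_tokens: int = GEN_BUDGET) -> int:
    """Tokens left for the prompt after reserving room for generation."""
    if max_new_tokens >= ctx_length:
        raise ValueError("generation budget exceeds the context window")
    return ctx_length - max_new_tokens


def run_demo() -> None:
    """Download the weights and generate one completion (heavy: several GB)."""
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    repo = ("sstoica12/acquisition_metamath_llama_instruct_3b_"
            "math_proximity_500_combined_metamath")
    tokenizer = AutoTokenizer.from_pretrained(repo)
    model = AutoModelForCausalLM.from_pretrained(repo, torch_dtype=torch.bfloat16)

    prompt = "Solve: if 3x + 5 = 20, what is x?"  # illustrative math prompt
    inputs = tokenizer(prompt, return_tensors="pt")
    # Keep the prompt within the window after reserving the generation budget.
    assert inputs["input_ids"].shape[1] <= max_prompt_tokens()
    out = model.generate(**inputs, max_new_tokens=GEN_BUDGET)
    print(tokenizer.decode(out[0], skip_special_tokens=True))


# Uncomment to actually download the weights and generate:
# run_demo()
print(max_prompt_tokens())  # → 32512 tokens available for the prompt
```

Reserving an explicit generation budget before tokenizing the prompt is a simple way to avoid silently truncating long inputs against the 32768-token window.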

Limitations

Most details regarding the model's intended use, specific applications, training data, evaluation results, biases, risks, and environmental impact are marked as "More Information Needed" in its model card. Users should exercise caution and conduct thorough testing for any specific application, as its full capabilities and limitations are not yet documented.