ishikaa/acquisition_qwen3binstruct_math_proximity_oq

Text Generation | Concurrency Cost: 1 | Model Size: 3.1B | Quant: BF16 | Ctx Length: 32k | Published: Apr 6, 2026 | Architecture: Transformer | Cold

ishikaa/acquisition_qwen3binstruct_math_proximity_oq is a 3.1-billion-parameter instruction-tuned model developed by ishikaa. It is based on the Qwen architecture and supports a context length of 32,768 tokens. Its specific differentiators and primary use cases are not documented, suggesting it may be a foundational or general-purpose model awaiting further specialization or documentation.


Model Overview

This model, ishikaa/acquisition_qwen3binstruct_math_proximity_oq, is a 3.1-billion-parameter instruction-tuned model. It is built on the Qwen architecture and supports a substantial context length of 32,768 tokens. The model is shared on the Hugging Face Hub, and its model card was automatically generated.

Key Characteristics

  • Parameter Count: 3.1 billion parameters.
  • Architecture: Based on the Qwen model family.
  • Context Length: Features a 32,768-token context window.
  • Instruction-Tuned: Designed to follow instructions, typical of many modern LLMs.
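Because the model card does not document a prompt template, the safest assumption for a Qwen-based instruct model is the ChatML-style format that the Qwen family uses. The sketch below builds such a prompt by hand; whether this particular checkpoint expects it is an assumption, and `build_chatml_prompt` is a hypothetical helper, not part of any published API.

```python
# Minimal sketch of a ChatML-style prompt, the format used by Qwen
# instruct models. Whether this checkpoint expects it is an assumption:
# the model card documents no prompt template.

def build_chatml_prompt(user_message: str,
                        system_message: str = "You are a helpful assistant.") -> str:
    """Assemble a single-turn ChatML prompt string."""
    return (
        f"<|im_start|>system\n{system_message}<|im_end|>\n"
        f"<|im_start|>user\n{user_message}<|im_end|>\n"
        f"<|im_start|>assistant\n"
    )

prompt = build_chatml_prompt("What is 17 * 23?")
print(prompt)
```

In practice, one would load the checkpoint's tokenizer and call its `apply_chat_template` method rather than hand-rolling the string, so that whatever template ships with the repository is used.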

Limitations and Further Information

The model card currently marks detailed information on the model's development, training data, evaluation results, intended direct and downstream uses, and potential biases or risks as "More Information Needed." Users should be aware of these undocumented aspects and exercise caution; recommendations on appropriate use are pending further details from the developer, ishikaa. No unique differentiators or specialized applications are specified in the available documentation.