ishikaa/influence_metamath_qwen2.5-3b_repeat_regularized_1k_scaled_e3
Text generation · Concurrency cost: 1 · Model size: 3.1B · Quantization: BF16 · Context length: 32k · Published: Mar 23, 2026 · Architecture: Transformer

The ishikaa/influence_metamath_qwen2.5-3b_repeat_regularized_1k_scaled_e3 model is a 3.1-billion-parameter language model with a 32,768-token (32k) context length, based on the Qwen2.5 architecture. Specific details about its training procedure, primary differentiators, and intended use cases are not provided in the available documentation.
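Since the page gives no usage instructions, here is a minimal, hypothetical loading sketch with the Hugging Face transformers library, assuming the checkpoint is published on the Hugging Face Hub under the repo id shown above; the prompt and generation settings are illustrative only.

```python
REPO_ID = "ishikaa/influence_metamath_qwen2.5-3b_repeat_regularized_1k_scaled_e3"

def load_model():
    # Imports kept local so the module can be inspected without the
    # heavyweight dependencies installed.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(REPO_ID)
    # BF16 matches the quantization listed in the model metadata above.
    model = AutoModelForCausalLM.from_pretrained(REPO_ID, torch_dtype=torch.bfloat16)
    return tokenizer, model

if __name__ == "__main__":
    tokenizer, model = load_model()
    # Illustrative prompt; the "metamath" in the repo name suggests a
    # math-oriented fine-tune, but this is not confirmed by the page.
    inputs = tokenizer("Prove that 2 + 2 = 4.", return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=64)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Note that a 3.1B model in BF16 needs roughly 6–7 GB of memory for the weights alone, so a GPU (or `device_map="auto"` with accelerate installed) is advisable for interactive use.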
