ishikaa/influence_metamath_qwen2.5-3b_proximity_repeat_regularized_1k_scaled_e1
Text Generation | Concurrency Cost: 1 | Model Size: 3.1B | Quant: BF16 | Context Length: 32k | Published: Mar 23, 2026 | Architecture: Transformer

ishikaa/influence_metamath_qwen2.5-3b_proximity_repeat_regularized_1k_scaled_e1 is a 3.1-billion-parameter language model published by ishikaa. It is based on the Qwen2.5 architecture and supports a 32,768-token context length. The name's components ("influence_metamath", "proximity_repeat_regularized") suggest fine-tuning on MetaMath-style mathematical data with a repeat-regularization scheme, indicating an optimization for mathematical reasoning and tasks requiring contextual understanding over long sequences. It is likely intended for applications demanding robust numerical and logical processing.
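If the model is hosted on the Hugging Face Hub under this identifier (an assumption; the card does not state its hosting), it should load through the standard `transformers` causal-LM API used for Qwen2.5-based models. A minimal sketch, with BF16 weights per the quantization listed above:

```python
# Hypothetical usage sketch for this model card; assumes the model is
# available on the Hugging Face Hub and that `transformers` and `torch`
# are installed. Not an official example from the model author.

MODEL_ID = "ishikaa/influence_metamath_qwen2.5-3b_proximity_repeat_regularized_1k_scaled_e1"
MAX_CONTEXT_TOKENS = 32_768  # 32k context length stated on the card


def generate(prompt: str, max_new_tokens: int = 256) -> str:
    """Load the model in BF16 and return a completion for `prompt`.

    Downloads the weights on first call; imports are deferred so the
    module can be inspected without torch/transformers installed.
    """
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype=torch.bfloat16,  # matches the BF16 quant on the card
        device_map="auto",
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the echoed prompt.
    new_tokens = output[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)
```

Given the card's emphasis on mathematical reasoning, a natural first prompt is a word problem or a short proof-style question; inputs should stay under the 32k-token context limit.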
