ishikaa/influence_metamath_qwen2.5-3b_repeat_regularized_1k_scaled
Text generation · Model size: 3.1B · Quant: BF16 · Context length: 32k · Architecture: Transformer · Published: Mar 23, 2026
ishikaa/influence_metamath_qwen2.5-3b_repeat_regularized_1k_scaled is a 3.1-billion-parameter language model based on the Qwen2.5 architecture. It targets general language understanding and generation tasks, and its moderate parameter count keeps deployment costs low, making it suitable for text-based applications that need a balance between output quality and computational resources.
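For readers who want to try the checkpoint, a minimal loading sketch using the Hugging Face `transformers` library is shown below. The repository id is taken from the title above; the prompt and generation parameters are illustrative assumptions, not settings recommended by the model authors, and the sketch assumes the checkpoint is published on the Hugging Face Hub.

```python
def generate(prompt: str, max_new_tokens: int = 128) -> str:
    """Generate a completion from the checkpoint named in this model card.

    Assumes `transformers` and `torch` are installed and the checkpoint is
    downloadable from the Hugging Face Hub under the repo id below.
    """
    # Imported lazily so the function can be defined without the heavy deps.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    repo_id = "ishikaa/influence_metamath_qwen2.5-3b_repeat_regularized_1k_scaled"
    tokenizer = AutoTokenizer.from_pretrained(repo_id)
    # bfloat16 matches the BF16 quantization listed in the model metadata.
    model = AutoModelForCausalLM.from_pretrained(repo_id, torch_dtype=torch.bfloat16)

    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)


if __name__ == "__main__":
    # Example prompt only; the model name suggests math-oriented fine-tuning,
    # but any text prompt works for a text-generation model.
    print(generate("Prove that the sum of two even numbers is even."))
```

The `__main__` guard keeps the module importable without triggering the multi-gigabyte download; actual generation requires enough memory for a 3.1B-parameter model in BF16 (roughly 6–7 GB of weights).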