hyunseoki/verl-math-transfer-7bi-to-7bi-v2
TEXT GENERATION
Concurrency Cost: 1
Model Size: 7.6B
Quant: FP8
Ctx Length: 32k
Published: Mar 27, 2026
Architecture: Transformer

hyunseoki/verl-math-transfer-7bi-to-7bi-v2 is a 7.6-billion-parameter Qwen2ForCausalLM model released by hyunseoki for experiments in transferring mathematical capabilities between models. It was trained with the verl framework and supports a 32,768-token context length, making it suitable for long mathematical problems and related inputs.
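As a Qwen2ForCausalLM checkpoint, the model can presumably be loaded with the Hugging Face transformers library. The sketch below shows one way to do this; the repo id and context length come from this card, while the prompt format and generation settings are illustrative assumptions, not documented defaults.

```python
# Usage sketch for hyunseoki/verl-math-transfer-7bi-to-7bi-v2.
# MODEL_ID and MAX_CONTEXT are taken from the model card; the prompt
# template and generation parameters below are assumptions.

MODEL_ID = "hyunseoki/verl-math-transfer-7bi-to-7bi-v2"
MAX_CONTEXT = 32_768  # context length stated on the card (32k tokens)


def build_prompt(problem: str) -> str:
    """Wrap a math problem in a plain instruction prompt (assumed format)."""
    return (
        "Solve the following problem step by step.\n\n"
        f"Problem: {problem}\nSolution:"
    )


if __name__ == "__main__":
    # transformers is imported here so the helpers above stay dependency-free.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")

    inputs = tokenizer(
        build_prompt("Compute 12 * 34."), return_tensors="pt"
    ).to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=512)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The model weights are downloaded on first use; with the FP8 quantization noted above, memory requirements should be lower than for a full-precision 7.6B model, though the exact footprint depends on the runtime.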
