hyunseoki/verl-math-transfer-7bi-to-3bi-fix05-pool7to1
Text Generation · Concurrency cost: 1 · Model size: 7.6B · Quantization: FP8 · Context length: 32k · Published: Mar 30, 2026 · Architecture: Transformer · Cold
hyunseoki/verl-math-transfer-7bi-to-3bi-fix05-pool7to1 is a 7.6 billion parameter model built on the Qwen2ForCausalLM architecture, developed by hyunseoki for mathematical transfer-learning experiments with the verl framework. The model targets transferring mathematical capabilities from a 7B to a 3B configuration, making it suitable for research and applications that require efficient mathematical reasoning. It publishes multiple checkpoint revisions, allowing granular analysis of the transfer-learning process.
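Since the repository exposes multiple checkpoint revisions, a specific one can be pinned via the `revision` argument of Hugging Face `transformers`. The sketch below is illustrative, not from the model card: the example revision name is an assumption, so check the repository's branch/tag list for the actual checkpoint identifiers.

```python
MODEL_ID = "hyunseoki/verl-math-transfer-7bi-to-3bi-fix05-pool7to1"

def checkpoint_kwargs(revision: str = "main") -> dict:
    """Build keyword arguments for from_pretrained, pinned to a revision."""
    return {
        "pretrained_model_name_or_path": MODEL_ID,
        "revision": revision,   # git branch, tag, or commit of the checkpoint
        "torch_dtype": "auto",  # load weights in their stored precision
    }

def load_checkpoint(revision: str = "main"):
    """Download and instantiate the model and tokenizer at a given revision."""
    from transformers import AutoModelForCausalLM, AutoTokenizer  # heavy import
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID, revision=revision)
    model = AutoModelForCausalLM.from_pretrained(**checkpoint_kwargs(revision))
    return tokenizer, model
```

Pinning a revision makes experiments reproducible even as the repository's `main` branch advances; comparing models loaded at different checkpoint revisions is one way to inspect the transfer-learning trajectory.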