mlfoundations-dev/b2_math_random
Type: Text Generation
Concurrency Cost: 1
Model Size: 7.6B
Quantization: FP8
Context Length: 32k
Published: Apr 19, 2025
License: apache-2.0
Architecture: Transformer
Weights: Open
The mlfoundations-dev/b2_math_random model is a 7.6-billion-parameter instruction-tuned language model, fine-tuned from Qwen/Qwen2.5-7B-Instruct. It was trained on the mlfoundations-dev/b2_math_random dataset; the dataset name suggests a specialization in mathematical reasoning, with "random" most plausibly referring to randomly sampled training examples rather than random number generation. The model is intended for use cases that benefit from stronger performance in this fine-tuning domain.
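The page does not include usage code, but since the model is a Qwen2.5-7B-Instruct fine-tune with open weights, it should load through the standard Hugging Face transformers chat workflow. The sketch below assumes that workflow and the base model's chat template; the math prompt is illustrative only.

```python
# Minimal sketch (not from the model card): querying the model locally
# via Hugging Face transformers, assuming the Qwen2.5-Instruct chat template.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mlfoundations-dev/b2_math_random"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # use the checkpoint's native precision
    device_map="auto",    # place layers on available GPU(s)/CPU
)

# Build a chat-formatted prompt and generate a reply.
messages = [{"role": "user", "content": "What is 17 * 23? Show your reasoning."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
outputs = model.generate(inputs, max_new_tokens=256)

# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```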