ZonglinY/MOOSE-Star-R1D-7B
Text generation · 7.6B parameters · FP8 quantization · 32k context length · Concurrency cost: 1 · Published: Apr 4, 2026 · License: apache-2.0 · Architecture: Transformer · Open weights

MOOSE-Star-R1D-7B by ZonglinY is a 7.6-billion-parameter multi-task language model fine-tuned for scientific discovery workflows, excelling at both inspiration retrieval and hypothesis composition. Built on DeepSeek-R1-Distill-Qwen-7B, it maintains high accuracy in selecting relevant cross-paper inspirations while significantly outperforming single-task models at generating structured delta hypotheses. The model is optimized for research-oriented tasks and remains robust under varying levels of inspiration noise.
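A minimal usage sketch with the Hugging Face `transformers` library is shown below. The prompt layout (background question plus retrieved inspiration, followed by a request for a delta hypothesis) is an illustrative assumption, not the model's documented template; `build_prompt` and `generate_hypothesis` are hypothetical helper names.

```python
def build_prompt(background: str, inspiration: str) -> str:
    """Compose a hypothesis-generation prompt from a background research
    question and a retrieved cross-paper inspiration.
    NOTE: this format is illustrative; check the model card for the
    actual expected prompt template."""
    return (
        f"Background research question:\n{background}\n\n"
        f"Retrieved inspiration:\n{inspiration}\n\n"
        "Compose a structured delta hypothesis:"
    )


def generate_hypothesis(background: str, inspiration: str,
                        max_new_tokens: int = 512) -> str:
    """Load the model and generate a hypothesis for the given inputs.
    Requires `transformers` and `torch`; downloads ~7.6B FP8 weights."""
    # Lazy import so build_prompt stays usable without a GPU environment.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "ZonglinY/MOOSE-Star-R1D-7B"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id, torch_dtype="auto", device_map="auto"
    )

    inputs = tokenizer(build_prompt(background, inspiration),
                       return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the echoed prompt.
    new_tokens = output[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)
```

Since the model is a DeepSeek-R1 distillation, generations may include a chain-of-thought segment before the final hypothesis; downstream code should be prepared to strip any reasoning preamble.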
