ZonglinY/MOOSE-Star-HC-R1D-7B
Text generation · Model size: 7.6B · Quantization: FP8 · Context length: 32k · Published: Mar 2, 2026 · License: apache-2.0 · Architecture: Transformer · Open weights

ZonglinY/MOOSE-Star-HC-R1D-7B is a 7.6-billion-parameter language model, fine-tuned from DeepSeek-R1-Distill-Qwen-7B, designed for generating scientific hypotheses. It excels at composing incremental "delta hypotheses" by integrating inspirations from new research papers with existing research questions and background surveys. The model decomposes hypothesis generation into key components: inspiration, motivation, mechanism, and methodology, making it well suited to structured scientific-discovery workflows.
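A minimal usage sketch of the workflow described above. The prompt field names and format here are illustrative assumptions, not the model's documented input schema; generation itself uses the standard Hugging Face `transformers` API and requires downloading the 7.6B-parameter weights, so it is shown commented out.

```python
from textwrap import dedent


def build_delta_hypothesis_prompt(research_question: str,
                                  background_survey: str,
                                  inspiration: str) -> str:
    """Compose a structured prompt pairing an existing research question
    and background survey with an inspiration from a new paper.
    The field layout is a hypothetical example, not an official format."""
    return dedent(f"""\
        Research question: {research_question}
        Background survey: {background_survey}
        Inspiration: {inspiration}
        Task: propose a delta hypothesis that integrates the inspiration,
        stating its motivation, mechanism, and methodology.""")


if __name__ == "__main__":
    prompt = build_delta_hypothesis_prompt(
        "Can catalyst X improve reaction yield at room temperature?",
        "Prior work reports modest yields with catalysts A and B.",
        "A recent paper shows ligand L stabilizes intermediate I.",
    )
    print(prompt)

    # Generation (requires downloading the model weights):
    # from transformers import AutoModelForCausalLM, AutoTokenizer
    # tok = AutoTokenizer.from_pretrained("ZonglinY/MOOSE-Star-HC-R1D-7B")
    # model = AutoModelForCausalLM.from_pretrained("ZonglinY/MOOSE-Star-HC-R1D-7B")
    # out = model.generate(**tok(prompt, return_tensors="pt"), max_new_tokens=512)
    # print(tok.decode(out[0], skip_special_tokens=True))
```

Keeping the prompt builder separate from the generation call makes it easy to inspect or log the structured inputs before committing to an expensive forward pass.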
