RUC-AIBOX/STILL-2

Text generation · Concurrency cost: 2 · Model size: 32.8B · Quantization: FP8 · Context length: 32k · Architecture: Transformer

RUC-AIBOX/STILL-2 is a 32.8 billion parameter language model developed by RUC-AIBOX, with a context length of 131072 tokens. The model is designed to strengthen reasoning in large language models, following the group's research on "Slow-thinking Reasoning Systems," and is suited to research and applications that demand advanced logical processing and multi-step problem-solving.


Overview

RUC-AIBOX/STILL-2 is a 32.8 billion parameter language model developed by RUC-AIBOX, featuring an extensive context window of 131072 tokens. The model comes out of research on "Slow-thinking Reasoning Systems," which aims to improve the reasoning abilities of large language models by having them produce extended, deliberate reasoning traces before answering. The underlying work is detailed in the papers "Enhancing LLM Reasoning with Reward-guided Tree Search" and "Imitate, Explore, and Self-Improve: A Reproduction Report on Slow-thinking Reasoning Systems."
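The "Reward-guided Tree Search" paper cited above pairs a policy that proposes candidate reasoning steps with a reward model that scores partial solutions, and the search expands the highest-scoring partial solutions first. A toy best-first sketch of that idea is below; the expansion and reward functions here are simple stand-ins for illustration, not the paper's actual policy or reward models:

```python
import heapq

def tree_search(root, expand, reward, is_goal, max_nodes=1000):
    """Best-first search: repeatedly expand the partial solution with
    the highest reward, as in reward-guided tree search."""
    # heapq is a min-heap, so rewards are negated to pop the best first.
    frontier = [(-reward(root), root)]
    expanded = 0
    while frontier and expanded < max_nodes:
        neg_r, node = heapq.heappop(frontier)
        if is_goal(node):
            return node
        expanded += 1
        for child in expand(node):
            heapq.heappush(frontier, (-reward(child), child))
    return None

# Toy task: build the string "1101" one bit at a time; the "reward model"
# simply counts positions that match the target prefix.
target = "1101"
result = tree_search(
    root="",
    expand=lambda s: [s + "0", s + "1"] if len(s) < len(target) else [],
    reward=lambda s: sum(a == b for a, b in zip(s, target)),
    is_goal=lambda s: s == target,
)
print(result)  # 1101
```

In the real system the nodes would be partial chains of thought and the reward a learned scorer, but the search skeleton is the same.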

Key Capabilities

  • Enhanced Reasoning: Designed with methodologies to improve logical processing and problem-solving, as explored in its foundational research.
  • Large Context Window: The 131072-token context length supports very long inputs and extended, coherent outputs.

Good For

  • Research in LLM Reasoning: Ideal for researchers exploring advanced reasoning techniques and cognitive architectures in language models.
  • Complex Problem Solving: Potentially suitable for applications requiring deep understanding and multi-step reasoning over extensive textual data.