mit-oasys/rlm-qwen3-8b-v0.1
Text generation
Concurrency cost: 1
Model size: 8B
Quantization: FP8
Context length: 32k
Published: Jan 15, 2026
License: MIT
Architecture: Transformer
Open weights

RLM-Qwen3-8B-v0.1 is an 8-billion-parameter Qwen3-based model developed by mit-oasys, post-trained specifically for the experiments described in the "Recursive Language Models" paper. It is optimized for operating within a fixed environment/scaffold, making it suitable for research into recursive language model trajectories. Its primary use case is research and development in environments that replicate the RLM framework.
