nikhilchandak/OpenForecaster-8B
Text Generation · Open Weights
Concurrency Cost: 1 · Model Size: 8B · Quant: FP8 · Ctx Length: 32k
Published: Dec 31, 2025 · License: MIT · Architecture: Transformer

OpenForecaster-8B by nikhilchandak is an 8-billion-parameter language model, post-trained from Qwen3-8B for open-ended forecasting of future events. It was trained with reinforcement learning on the OpenForesight dataset and supports a 32,768-token context length. The model provides calibrated confidence estimates, reasons about uncertainty, and uses retrieved information to improve its predictions, outperforming much larger models on non-numeric questions in forecasting benchmarks such as FutureX.
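For reference, a minimal sketch of loading and prompting the model with Hugging Face transformers, assuming the weights are published under the same identifier on the Hub; the forecasting question and probability-elicitation phrasing are illustrative, not the model's documented interface.

```python
# Minimal sketch: query OpenForecaster-8B with Hugging Face transformers.
# Assumption: the checkpoint is available on the Hub under this identifier.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "nikhilchandak/OpenForecaster-8B"  # assumed Hub identifier
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # use the dtype stored in the checkpoint
    device_map="auto",    # place layers on available GPU(s)/CPU
)

# Illustrative forecasting prompt: the model is post-trained to reason
# about uncertainty and state a calibrated probability.
messages = [
    {
        "role": "user",
        "content": (
            "Will the winner of the next FIFA World Cup be a European team? "
            "Explain your reasoning, then give a final probability."
        ),
    }
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=512)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```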
