LoganResearch/ARC-Base-8B-Condensed
Text Generation · Concurrency Cost: 1 · Model Size: 8B · Quant: FP8 · Ctx Length: 32k · Published: Jan 19, 2026 · License: CC-BY-4.0 · Architecture: Transformer · Open Weights

ARC-Base-8B-Condensed by LoganResearch is a fine-tuned 8-billion-parameter language model based on Hermes-3-Llama-3.1-8B, designed for dense, information-rich responses. It features Adaptive Recursive Cognition (ARC) with predictive behavioral control via CF-HoT heads to suppress verbosity and hedging, plus a recursive self-improvement loop. It is suited to research on self-improving LLMs and to applications that require concise, direct output.
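Since the model is derived from Hermes-3-Llama-3.1-8B, which uses the ChatML prompt format, a minimal sketch of prompt construction might look like the following. This assumes the fine-tune retains the base model's ChatML template; `build_chatml_prompt` is a hypothetical helper, not part of any published API.

```python
# Sketch of a ChatML-style prompt, the format used by the Hermes-3 base model.
# Assumption: ARC-Base-8B-Condensed keeps this template after fine-tuning.
def build_chatml_prompt(system: str, user: str) -> str:
    """Assemble a single-turn ChatML prompt ending at the assistant turn."""
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        f"<|im_start|>assistant\n"
    )

prompt = build_chatml_prompt(
    "You are a concise assistant. Answer directly, without hedging.",
    "Summarize the ARC architecture in two sentences.",
)
print(prompt)
```

In practice, `tokenizer.apply_chat_template` from Hugging Face `transformers` handles this formatting automatically when the repository ships a chat template.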
