LoganResearch/ARC-Base-8B
Text Generation · Open Weights

- Model Size: 8B
- Quantization: FP8
- Context Length: 32k
- Concurrency Cost: 1
- Published: Jan 17, 2026
- License: CC-BY-4.0
- Architecture: Transformer

LoganResearch/ARC-Base-8B is an 8-billion-parameter Llama 3.1-based language model developed by Logan Research, with a 32,768-token context length. It integrates an Adaptive Repetition Controller (ARC) system that uses Contrastive Fiber Heads-on-Thought (CF-HoT) to detect and suppress undesirable behavioral patterns such as repetition, verbosity, and hedging at decode time. The model is optimized to produce concise, information-dense responses with minimal latency overhead, making it suitable for applications that require direct and efficient communication.
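The model card does not publish the ARC/CF-HoT internals, but the general family of techniques it describes (decode-time suppression of repeated tokens) can be illustrated with a classic frequency-based logit penalty. The function name and penalty value below are illustrative assumptions, not the actual ARC implementation:

```python
# Minimal sketch of decode-time repetition suppression, the general
# class of technique ARC builds on. This is NOT the CF-HoT
# implementation, just an illustrative repetition penalty applied to
# logits before sampling.

def suppress_repetition(logits, generated_ids, penalty=1.3):
    """Scale down the logits of tokens that were already generated.

    logits: dict mapping token id -> raw logit
    generated_ids: list of token ids produced so far
    penalty: > 1.0; larger values suppress repeats more strongly
    """
    adjusted = dict(logits)
    for tok in set(generated_ids):
        if tok in adjusted:
            val = adjusted[tok]
            # Divide positive logits, multiply negative ones, so the
            # penalty always makes a repeated token less likely.
            adjusted[tok] = val / penalty if val > 0 else val * penalty
    return adjusted

logits = {0: 2.0, 1: 1.5, 2: -0.5}
out = suppress_repetition(logits, generated_ids=[1, 1, 2])
# Token 0 is untouched; tokens 1 and 2 are penalized because they
# already appear in the generated sequence.
```

Because the adjustment runs purely on the logits at each decoding step, this style of control adds negligible latency, which is consistent with the card's claim of minimal overhead.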
