EpistemeAI/ReasoningCore-1B-r1-0
Task: Text generation · Model size: 1B · Quantization: BF16 · Context length: 32k · Published: Feb 10, 2025 · License: llama3.2 · Architecture: Transformer

ReasoningCore-1B-r1-0 is a 1-billion-parameter multilingual large language model developed by EpistemeAI, built on an optimized transformer architecture with specialized reasoning pathways. Pretrained on up to 9 trillion tokens of publicly available data and then instruction-tuned, it is aimed at nuanced reasoning, dialogue management, retrieval, and summarization tasks. With a context length of 128k tokens, it is designed for conversational AI, knowledge retrieval, and general natural language generation, and is reported to outperform larger models on several reasoning benchmarks.
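A minimal usage sketch, assuming the checkpoint is published on the Hugging Face Hub under the repo id above and loads with the standard `transformers` causal-LM API (the prompt and generation settings here are illustrative, not from the model card):

```python
# Hypothetical inference sketch for ReasoningCore-1B-r1-0.
# Assumes the model is available on the Hugging Face Hub and that
# `transformers` and `torch` are installed.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "EpistemeAI/ReasoningCore-1B-r1-0"

def generate(prompt: str, max_new_tokens: int = 256) -> str:
    """Load the model in BF16 (matching the published quantization) and
    generate a completion for a single prompt."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype="bfloat16")
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)

if __name__ == "__main__":
    print(generate("Explain why the sky is blue in two sentences."))
```

For chat-style use, the same tokenizer's `apply_chat_template` method can format a list of role/content messages before generation, which is the usual pattern for instruction-tuned Llama-family models.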
