EpistemeAI/ReasoningCore-3B-RE1-V2C
Text generation · Model size: 3.2B · Quantization: BF16 · Context length: 32k · Published: Feb 26, 2025 · License: llama3.2 · Architecture: Transformer

EpistemeAI/ReasoningCore-3B-RE1-V2C is a 3.2-billion-parameter, multilingual, reasoning-enhanced large language model developed by EpistemeAI. Built on an optimized transformer architecture with a 32,768-token context length, it is instruction-tuned using Group Robust Preference Optimization (GRPO) and fine-tuned on reasoning datasets. The model excels at nuanced reasoning, dialogue management, retrieval, and summarization, making it well suited to conversational AI and knowledge-retrieval applications.
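A minimal usage sketch follows, assuming the model is published on the Hugging Face Hub under the ID above and that the `transformers` library (with `torch`) is installed; the system prompt and generation parameters are illustrative, not prescribed by the model card.

```python
MODEL_ID = "EpistemeAI/ReasoningCore-3B-RE1-V2C"  # assumed Hub model ID


def build_messages(question: str) -> list:
    """Build a chat-style prompt for the instruction-tuned model.

    The system prompt here is an illustrative placeholder.
    """
    return [
        {"role": "system", "content": "You are a careful reasoning assistant."},
        {"role": "user", "content": question},
    ]


def generate(question: str, max_new_tokens: int = 256) -> str:
    """Run text generation with the model (requires torch + transformers)."""
    # Imported lazily so the prompt helper above works without heavy deps.
    import torch
    from transformers import pipeline

    generator = pipeline(
        "text-generation",
        model=MODEL_ID,
        torch_dtype=torch.bfloat16,  # matches the BF16 quantization listed above
        device_map="auto",
    )
    result = generator(build_messages(question), max_new_tokens=max_new_tokens)
    return result[0]["generated_text"]
```

Keeping the pipeline call behind a function means the prompt-construction logic can be reused or tested without downloading the 3.2B-parameter weights.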
