sheryc/Qwen2.5-14B-Instruct-CARE
Text generation · Model size: 14.8B · Quantization: FP8 · Context length: 32K · Published: Sep 12, 2025 · License: apache-2.0 · Architecture: Transformer

sheryc/Qwen2.5-14B-Instruct-CARE is a 14.7-billion-parameter instruction-tuned language model based on Qwen2.5-14B-Instruct, released by the authors of the CARE framework. It is trained for native retrieval-augmented reasoning: by explicitly integrating in-context evidence, it aims to improve context fidelity and reduce hallucinations. The model targets complex reasoning tasks that require evidence integration, and it produces structured reasoning outputs with explicit citations, making it well suited to applications that demand high factual accuracy and explainability.
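A minimal usage sketch follows, assuming the model exposes the standard Hugging Face `transformers` chat interface (an assumption; consult the model card for the exact prompt format CARE expects). The `build_messages` helper and its system prompt are illustrative, not part of the released model:

```python
MODEL_ID = "sheryc/Qwen2.5-14B-Instruct-CARE"


def build_messages(context: str, question: str) -> list[dict]:
    """Pack in-context evidence and a question into a chat request.

    The instruction to answer only from the provided context and to cite
    evidence is a hypothetical prompt, chosen to match the model's stated
    focus on context fidelity and explicit citations.
    """
    return [
        {
            "role": "system",
            "content": (
                "Answer using only the provided context, and cite the "
                "evidence you rely on."
            ),
        },
        {
            "role": "user",
            "content": f"Context:\n{context}\n\nQuestion: {question}",
        },
    ]


def generate_answer(context: str, question: str, max_new_tokens: int = 512) -> str:
    """Load the model and generate a context-grounded answer.

    Imports are deferred because transformers and a ~15B FP8 checkpoint
    are heavyweight dependencies.
    """
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype="auto", device_map="auto"
    )
    prompt = tokenizer.apply_chat_template(
        build_messages(context, question),
        tokenize=False,
        add_generation_prompt=True,
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, skipping the prompt.
    return tokenizer.decode(
        output[0][inputs.input_ids.shape[1]:], skip_special_tokens=True
    )
```

With the 32K context window, fairly long retrieved passages can be passed in the `context` argument; the FP8 quantization keeps the memory footprint of the 14.8B checkpoint manageable on a single large GPU.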
