ericflo/Llama-3.2-3B-COT
Text generation · Concurrency cost: 1 · Model size: 3.2B · Quant: BF16 · Context length: 32k · License: apache-2.0 · Architecture: Transformer · Open weights

ericflo/Llama-3.2-3B-COT is a 3.2-billion-parameter model fine-tuned by ericflo from the Llama 3.2 3B base model to generate explicit thought processes before producing answers. It specializes in producing high-quality thought chains through a ranking approach: the model first generates thought patterns and then selects the most effective ones, making it well suited to tasks that require step-by-step reasoning, such as problem solving, mathematical reasoning, and logical deduction. The model supports a context length of 32,768 tokens, with thought chains of up to 128 tokens and final answers of up to 2,048 tokens.
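Because the model emits an explicit thought chain before the final answer, downstream code typically needs to separate the two. A minimal sketch of that post-processing step, assuming a hypothetical `<thought>...</thought>` delimiter format (the actual delimiters used by this model are not specified here):

```python
def split_thought_and_answer(generated: str) -> tuple[str, str]:
    """Split generated text into (thought, answer).

    Assumes a hypothetical '<thought>...</thought>answer' layout;
    if no well-formed thought block is found, the whole text is
    treated as the answer and the thought is empty.
    """
    open_tag, close_tag = "<thought>", "</thought>"
    start = generated.find(open_tag)
    end = generated.find(close_tag)
    if start == -1 or end == -1 or end < start:
        return "", generated.strip()
    thought = generated[start + len(open_tag):end].strip()
    answer = generated[end + len(close_tag):].strip()
    return thought, answer


# Example with the assumed delimiter format:
text = "<thought>2 + 2 is basic addition.</thought>4"
thought, answer = split_thought_and_answer(text)
```

In practice the parsing logic should be adapted once the model's real chain-of-thought template is known; the fallback branch keeps the function safe on outputs that omit the thought block.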
