yangxw/Llama-3.2-1B-countdown-backtrack
Text generation · Model size: 1B · Quantization: BF16 · Context length: 32k · License: MIT · Architecture: Transformer · Open weights

yangxw/Llama-3.2-1B-countdown-backtrack is a 1-billion-parameter Llama-based causal language model developed by Xiao-Wen Yang and collaborators. It applies the self-backtracking method introduced in the paper "Step Back to Leap Forward: Self-Backtracking for Boosting Reasoning of Language Models," which trains the model to recognize when a reasoning path is unpromising and step back to an earlier state before continuing, strengthening the slow-thinking behavior of LLMs. The model is fine-tuned from Llama 3.2 (on the Countdown task, per the model name) and is suited to problems that benefit from deliberate search, logical deduction, and backtracking.
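Since the checkpoint is a standard Llama-architecture causal LM, it should load with the Hugging Face `transformers` library. The sketch below assumes ordinary `AutoModelForCausalLM` compatibility; the Countdown-style prompt is illustrative only, not the exact format used during fine-tuning.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Standard causal-LM loading; assumes the repo is transformers-compatible.
model_id = "yangxw/Llama-3.2-1B-countdown-backtrack"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the BF16 precision listed above
    device_map="auto",
)

# Hypothetical Countdown-style prompt; consult the paper/repo for the
# prompt format actually used in fine-tuning.
prompt = "Using the numbers [3, 7, 25, 50], reach the target 86."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# Greedy decoding; the model's backtracking behavior is learned, so it may
# emit intermediate "step back" text within the generated reasoning trace.
outputs = model.generate(**inputs, max_new_tokens=256, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Note that self-backtracking is baked into the model's training rather than exposed as a separate API, so generation is invoked the same way as for any other causal LM.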
