CharlesLi/llama_2_cot_simplest_code_math_4_full
Task: Text generation · Model size: 7B · Quantization: FP8 · Context length: 4k · Published: Jan 20, 2025 · License: llama2 · Architecture: Transformer (open weights)

CharlesLi/llama_2_cot_simplest_code_math_4_full is a 7-billion-parameter language model fine-tuned from Meta's Llama-2-7b-chat-hf. It is optimized for reasoning and mathematical tasks, reaching a training loss of 0.6062, and is intended for applications that require robust logical inference and numerical problem solving within a 4096-token context window.
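A minimal usage sketch, assuming the checkpoint is published on the Hugging Face Hub under the repo id above and accepts the standard Llama-2 chat prompt template inherited from its base model (the `generate` helper below is illustrative, not part of the model card):

```python
# Assumed Hub repo id, taken from the model name on this card.
REPO_ID = "CharlesLi/llama_2_cot_simplest_code_math_4_full"


def build_llama2_chat_prompt(user_message: str) -> str:
    # Llama-2-chat instruction template the base model was trained with.
    return f"<s>[INST] {user_message} [/INST]"


def generate(prompt: str, max_new_tokens: int = 256) -> str:
    # Imported lazily so the prompt helper above works without transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(REPO_ID)
    model = AutoModelForCausalLM.from_pretrained(REPO_ID, device_map="auto")
    inputs = tokenizer(build_llama2_chat_prompt(prompt), return_tensors="pt").to(model.device)
    # Keep prompt + generated tokens within the model's 4096-token context window.
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, dropping the echoed prompt.
    return tokenizer.decode(output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True)
```

For math or code questions, pass the full problem statement as a single user turn, e.g. `generate("What is 17 * 24? Think step by step.")`.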
