CharlesLi/llama_2_llama_2_code_math_1_full
Text generation · Concurrency cost: 1 · Model size: 7B · Quantization: FP8 · Context length: 4k · Published: Jan 19, 2025 · License: llama2 · Architecture: Transformer · Open weights · Cold start
The CharlesLi/llama_2_llama_2_code_math_1_full model is a 7-billion-parameter language model fine-tuned from Meta's Llama-2-7b-chat-hf. It is optimized for the tasks covered by its fine-tuning dataset and reaches a loss of 0.8356 on its evaluation set. The model is intended for applications that require a Llama 2-based architecture with this particular fine-tuning focus.
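As a usage sketch, the checkpoint can presumably be loaded with the standard Hugging Face `transformers` API, like any other Llama-2-derived model. The prompt-formatting helper below follows the well-known Llama-2-chat template that the base model expects; the function names and the default system message are illustrative, not from this model card.

```python
MODEL_ID = "CharlesLi/llama_2_llama_2_code_math_1_full"


def build_llama2_chat_prompt(user_msg: str,
                             system_msg: str = "You are a helpful assistant.") -> str:
    """Wrap a user message in the Llama-2-chat prompt template
    ([INST] / <<SYS>> markers) that the base chat model was trained on."""
    return f"<s>[INST] <<SYS>>\n{system_msg}\n<</SYS>>\n\n{user_msg} [/INST]"


def generate(user_msg: str, max_new_tokens: int = 256) -> str:
    """Download the checkpoint and generate a completion.
    Imports are deferred so the prompt helper works without transformers installed."""
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")

    inputs = tokenizer(build_llama2_chat_prompt(user_msg),
                       return_tensors="pt").to(model.device)
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, skipping the prompt.
    new_tokens = output_ids[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)
```

Note that a 7B model in FP8 still needs several gigabytes of GPU memory; `device_map="auto"` lets `accelerate` place the weights across available devices.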