CharlesLi/llama_2_llama_2_code_math_3_full
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quantization: FP8 · Context Length: 4k · Published: Jan 19, 2025 · License: llama2 · Architecture: Transformer · Open Weights

CharlesLi/llama_2_llama_2_code_math_3_full is a 7-billion-parameter language model fine-tuned from Meta's Llama-2-7b-chat-hf. Its training configuration lists a 'generator' dataset, which suggests an optimization for content-generation tasks. It is intended for applications that need a Llama-2-based model with enhanced generative capabilities.
