axxd/wizardllama-7b
Text Generation | Concurrency Cost: 1 | Model Size: 7B | Quant: FP8 | Ctx Length: 4k
Published: Mar 23, 2024 | License: apache-2.0 | Architecture: Transformer | Open Weights | Cold

axxd/wizardllama-7b is a 7 billion parameter language model created by axxd, merged from CodeLlama-7b-Python-hf and WizardCoder-Python-7B-V1.0 using the SLERP method. This model is specifically optimized for Python code generation and understanding, leveraging the strengths of both foundational code models. It is designed for developers requiring a focused and efficient solution for programming tasks within its 4096-token context window.
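The SLERP (spherical linear interpolation) merge mentioned above blends the corresponding weight tensors of the two parent models along the arc between them rather than along a straight line, which tends to preserve the magnitude structure of each layer. The sketch below is an illustrative, self-contained NumPy implementation of per-tensor SLERP, not the exact code used to produce this model; the function name `slerp` and the interpolation factor `t` are assumptions for demonstration.

```python
import numpy as np

def slerp(t, w0, w1, eps=1e-8):
    """Spherically interpolate between two weight tensors.

    t=0 returns w0, t=1 returns w1; intermediate t follows the
    great-circle arc between the flattened, normalized tensors.
    """
    v0 = w0.ravel().astype(np.float64)
    v1 = w1.ravel().astype(np.float64)
    # Angle between the two weight vectors.
    n0 = v0 / (np.linalg.norm(v0) + eps)
    n1 = v1 / (np.linalg.norm(v1) + eps)
    dot = np.clip(np.dot(n0, n1), -1.0, 1.0)
    theta = np.arccos(dot)
    if theta < eps:
        # Nearly collinear tensors: fall back to plain linear interpolation.
        return (1.0 - t) * w0 + t * w1
    # Standard SLERP coefficients.
    s0 = np.sin((1.0 - t) * theta) / np.sin(theta)
    s1 = np.sin(t * theta) / np.sin(theta)
    return ((s0 * v0 + s1 * v1).reshape(w0.shape)).astype(w0.dtype)
```

In a full merge, a routine like this would be applied tensor-by-tensor across the state dictionaries of CodeLlama-7b-Python-hf and WizardCoder-Python-7B-V1.0, possibly with a different `t` per layer.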
