Danielbrdz/CodeBarcenas-13b
Text generation · Concurrency cost: 1 · Model size: 13B · Quant: FP8 · Context length: 4k · Published: Nov 4, 2023 · License: llama2 · Architecture: Transformer · Open weights

Danielbrdz/CodeBarcenas-13b is a 13-billion-parameter language model specialized in Python code generation. It is based on WizardLM/WizardCoder-Python-13B-V1.0 and further fine-tuned on the mlabonne/Evol-Instruct-Python-1k dataset. The model is optimized for Python-specific coding tasks and supports a context length of 4096 tokens.
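Since the model descends from WizardCoder-Python-13B-V1.0, prompts are typically wrapped in that base model's Alpaca-style instruction template. A minimal sketch of building such a prompt, assuming the fine-tune keeps the base template (the exact format for CodeBarcenas-13b is not documented here):

```python
# Sketch of the Alpaca-style prompt template used by the WizardCoder base model.
# Whether CodeBarcenas-13b retains this exact template is an assumption.

def build_prompt(instruction: str) -> str:
    """Wrap a coding task in the Alpaca-style instruction template."""
    return (
        "Below is an instruction that describes a task. "
        "Write a response that appropriately completes the request.\n\n"
        f"### Instruction:\n{instruction}\n\n### Response:"
    )

prompt = build_prompt("Write a Python function that reverses a string.")
print(prompt)
```

The formatted prompt can then be passed to the model through any OpenAI-compatible or Hugging Face `transformers` text-generation pipeline; keep the total prompt plus generated tokens within the 4096-token context window.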
