MexIvanov/zephyr-python-ru-merged
Text generation · Concurrency cost: 1 · Model size: 7B · Quantization: FP8 · Context length: 4k · Published: Dec 21, 2023 · License: MIT · Architecture: Transformer · Open weights
MexIvanov/zephyr-python-ru-merged is a 7 billion parameter language model developed by C.B. Pronin, A.V. Volosova, A.V. Ostroukh, Yu.N. Strogov, V.V. Kurbatov, and A.S. Umarova. It is a fine-tuned version of HuggingFaceH4/zephyr-7b-beta, merged with a LoRA adapter trained on a mix of publicly available and machine-translated synthetic Python coding datasets. This model is specifically designed to enhance coding performance and support coding-related instructions in both Russian and English, with a context length of 4096 tokens.
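Because the model is a merge of zephyr-7b-beta with a LoRA adapter, it is expected to follow the Zephyr chat format of its base model. Below is a minimal sketch of building a bilingual coding prompt; the helper name is an assumption, and the template tokens (`<|system|>`, `<|user|>`, `<|assistant|>`) are taken from zephyr-7b-beta's documented chat format, not from this model card:

```python
# Sketch: format a prompt in the Zephyr chat style assumed to be inherited
# from HuggingFaceH4/zephyr-7b-beta. For actual generation you would load
# the model, e.g.:
#   from transformers import pipeline
#   pipe = pipeline("text-generation", model="MexIvanov/zephyr-python-ru-merged")
#   out = pipe(prompt, max_new_tokens=256)

def format_zephyr_prompt(system: str, user: str) -> str:
    """Build a single-turn prompt using Zephyr-style role tokens."""
    return (
        f"<|system|>\n{system}</s>\n"
        f"<|user|>\n{user}</s>\n"
        f"<|assistant|>\n"
    )

# The model supports coding instructions in both Russian and English.
prompt = format_zephyr_prompt(
    "You are a helpful Python coding assistant.",
    "Напиши функцию на Python, которая переворачивает строку.",  # "Write a Python function that reverses a string."
)
print(prompt)
```

With `transformers`, the same structure can also be produced via the tokenizer's `apply_chat_template` method, which reads the template stored with the model.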