flytech/Ruckus-PyAssi-13b
Text generation · Model size: 13B · Quantization: FP8 · Context length: 4k · Architecture: Transformer · Concurrency cost: 1

flytech/Ruckus-PyAssi-13b is a 13-billion-parameter language model fine-tuned by flytech from meta-llama/Llama-2-13b-hf. Optimized for code generation, particularly Python, it was trained with Supervised Fine-Tuning (SFT) and Low-Rank Adaptation (LoRA) on 10,000 examples from the flytech/llama-python-codes-30k dataset. The model is intended to serve as an executional layer, rich in Python code and instructional tasks, and is specifically formatted for chat-based code generation.
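Since the model is instruction-formatted for chat-based code generation, it can be invoked with the Hugging Face `transformers` library. The sketch below assumes the standard Llama-2 `[INST] ... [/INST]` prompt template inherited from the base model; the exact template used during fine-tuning is not documented here, so verify it against the dataset before relying on it:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "flytech/Ruckus-PyAssi-13b"


def build_prompt(instruction: str) -> str:
    # Assumed Llama-2-style instruction wrapper (not confirmed by the
    # model card) -- adjust if the fine-tuning data used another format.
    return f"[INST] {instruction} [/INST]"


def generate(instruction: str, max_new_tokens: int = 256) -> str:
    # Loads ~13B parameters; requires a GPU with sufficient memory
    # (or quantized weights) in practice.
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")
    inputs = tokenizer(build_prompt(instruction), return_tensors="pt").to(model.device)
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Strip the prompt tokens so only the completion is returned.
    completion = output_ids[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(completion, skip_special_tokens=True)
```

Usage would look like `generate("Write a Python function that reverses a string.")`, which returns the model's Python completion as a string.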
