flytech/Ruckus-13b-X
TEXT GENERATION
Concurrency Cost: 1 | Model Size: 13B | Quant: FP8 | Ctx Length: 4k | Architecture: Transformer | Status: Cold

Ruckus-13b-X is a 13-billion-parameter language model developed by flytech, fine-tuned from Meta's Llama-2-13b-hf. The fine-tune was trained for 6 epochs with the Adam optimizer at a learning rate of 0.0002 (2e-4). flytech has not documented specific differentiators or intended uses, but the Llama-2 base implies general-purpose language understanding and generation capabilities.
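The reported hyperparameters (Adam, learning rate 2e-4, 6 epochs) can be sketched in a minimal PyTorch training loop. This is an illustration only: the tiny linear model and random tensors below are stand-ins, not the actual Llama-2-13b-hf fine-tuning setup or data.

```python
# Illustrative sketch of the reported fine-tuning hyperparameters:
# Adam optimizer, lr=2e-4, 6 epochs. The model and data here are toy
# stand-ins; the real run fine-tuned Llama-2-13b-hf on flytech's dataset.
import torch

torch.manual_seed(0)
model = torch.nn.Linear(16, 16)  # stand-in for the 13B transformer
optimizer = torch.optim.Adam(model.parameters(), lr=2e-4)  # lr from the model card
loss_fn = torch.nn.MSELoss()

x = torch.randn(64, 16)  # dummy inputs
y = torch.randn(64, 16)  # dummy targets

for epoch in range(6):  # 6 epochs, as reported
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()
    print(f"epoch {epoch + 1}: loss={loss.item():.4f}")
```

In practice a 13B fine-tune would also involve tokenized text batches, a causal-language-modeling loss, and typically mixed precision and gradient accumulation, none of which are specified on this card.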
