flytech/Ruckus-13b-27
Text Generation · Concurrency Cost: 1 · Model Size: 13B · Quant: FP8 · Ctx Length: 4k · Architecture: Transformer · Status: Cold

Ruckus-13b-27 is a 13-billion-parameter language model developed by flytech, fine-tuned from Meta's Llama-2-13b-hf. It was trained with a constant learning rate of 0.0002 for 12 epochs using the Adam optimizer. Its primary differentiator and optimal use cases are not detailed in the available information.
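Given the 13B parameter count and FP8 quantization listed above, a rough estimate of the memory needed for the weights is about one byte per parameter. A minimal sketch of that arithmetic (13e9 is the card's nominal parameter count, not an exact figure):

```python
# Approximate weight-memory footprint of Ruckus-13b-27 at different precisions.
# 13B is the nominal count from the model card; the exact count differs slightly.
PARAMS = 13e9

BYTES_PER_PARAM = {"fp32": 4, "fp16": 2, "fp8": 1}

def weight_memory_gb(precision: str, params: float = PARAMS) -> float:
    """Estimated memory for model weights alone, in gigabytes (1 GB = 1e9 bytes)."""
    return params * BYTES_PER_PARAM[precision] / 1e9

for p in ("fp32", "fp16", "fp8"):
    print(f"{p}: ~{weight_memory_gb(p):.0f} GB")
```

Note that this covers the weights only; the KV cache for the 4k context window and activation memory add to the total at serving time.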
