flytech/Ruckus-13b-Y
Text Generation · Concurrency Cost: 1 · Model Size: 13B · Quant: FP8 · Ctx Length: 4k · Architecture: Transformer · Cold

Ruckus-13b-Y is a 13-billion-parameter causal language model developed by flytech, fine-tuned from Meta's Llama-2-13b-hf. Training used a learning rate of 2e-4 over 8 epochs. The model's specific differentiators and primary use cases are not detailed in the available information.
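As a Llama-2 fine-tune, the model can presumably be loaded with the Hugging Face transformers library. The sketch below is illustrative, not from the card: the model id and 4k context come from this page, while the sampling settings, dtype, and the `generate` helper are assumptions.

```python
# Minimal sketch for running flytech/Ruckus-13b-Y with transformers.
# MODEL_ID and the 4k context are from the model card; everything else
# (sampling values, float16, the helper itself) is an assumption.
MODEL_ID = "flytech/Ruckus-13b-Y"
CTX_LENGTH = 4096  # 4k context window per the card


def generation_config(max_new_tokens: int = 256) -> dict:
    """Illustrative keyword arguments for model.generate()."""
    # Leave headroom for the prompt within the 4k context window.
    assert max_new_tokens < CTX_LENGTH
    return {"max_new_tokens": max_new_tokens, "do_sample": True, "temperature": 0.7}


def generate(prompt: str) -> str:
    """Load the model and generate a completion (downloads ~13B weights)."""
    # Heavy imports are kept local so the helpers above stay importable
    # without torch/transformers installed.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype=torch.float16, device_map="auto"
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, **generation_config())
    return tokenizer.decode(output[0], skip_special_tokens=True)


# Example call (commented out to avoid downloading the weights):
# print(generate("Write a short poem about flight."))
```

Since the FP8 quantization listed above refers to the hosted deployment, local loading in float16 (as sketched) may need roughly 26 GB of GPU memory for the 13B weights.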
