afnna/salty-Llama-2-13b-hf-10epochs
Task: Text generation · Model size: 13B · Quantization: FP8 · Context length: 4k · Architecture: Transformer

afnna/salty-Llama-2-13b-hf-10epochs is a 13-billion-parameter language model based on the Llama 2 architecture. It was fine-tuned using AutoTrain; the repository name suggests the training run lasted 10 epochs. Building on the Llama 2 foundation, the model is intended for general language understanding and text generation tasks.
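As a sketch of how the checkpoint might be used, the snippet below loads it with the Hugging Face `transformers` library and generates a completion. This is an assumed usage pattern, not documented by the model authors; it requires `transformers` and `torch` to be installed and enough memory for a 13B-parameter checkpoint.

```python
# Hypothetical usage sketch for afnna/salty-Llama-2-13b-hf-10epochs.
MODEL_ID = "afnna/salty-Llama-2-13b-hf-10epochs"


def generate(prompt: str, max_new_tokens: int = 64) -> str:
    """Lazily load the checkpoint and return a text completion.

    The imports are deferred so this module can be inspected without
    downloading the (large) model weights.
    """
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    # device_map="auto" spreads the 13B weights across available devices.
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")

    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)
```

Calling `generate("Hello, my name is")` would download the weights on first use and return the decoded completion.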
