midoskarr/corrine3
Text Generation · Concurrency Cost: 1 · Model Size: 13B · Quant: FP8 · Ctx Length: 4k · Architecture: Transformer · Cold

midoskarr/corrine3 is a 13-billion-parameter language model trained with AutoTrain, which suggests a focus on automated, efficient model development. Given this training methodology, it is likely optimized for general language understanding and generation, making it suitable for a broad range of applications where rapid deployment and ease of training are priorities. Its 4096-token context length supports moderately sized inputs.
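Because of the 4096-token context limit, longer inputs need to be truncated or chunked before being sent to the model. A minimal sketch of context-budget chunking, using whitespace splitting as a stand-in for the model's real subword tokenizer (which would typically produce more tokens per word, so real budgets should be more conservative):

```python
def chunk_for_context(text: str, max_tokens: int = 4096, reserve: int = 512) -> list[str]:
    """Split text into chunks that fit a fixed context window.

    `reserve` leaves headroom in the window for the generated output.
    Whitespace splitting is only a stand-in for a real tokenizer.
    """
    tokens = text.split()
    budget = max_tokens - reserve  # input tokens allowed per chunk
    return [" ".join(tokens[i:i + budget]) for i in range(0, len(tokens), budget)]

# A 10,000-"token" input exceeds the 4k window and gets split into
# chunks of at most 3,584 tokens each (4096 minus the 512 reserve).
chunks = chunk_for_context("word " * 10_000)
```

In practice the model's own tokenizer should be used for counting, since subword tokenization inflates the token count relative to a whitespace split.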
