ahnyeonchan/OpenOrca-AYT-13B
Text Generation · Concurrency Cost: 1 · Model Size: 13B · Quant: FP8 · Ctx Length: 4k · Published: Sep 7, 2023 · License: llama2 · Architecture: Transformer · Open Weights

ahnyeonchan/OpenOrca-AYT-13B is a 13-billion-parameter language model published by ahnyeonchan and trained on the OpenOrca dataset. It targets general-purpose language understanding and generation, and its 4096-token context window allows it to process longer inputs. The model aims to deliver robust performance across a range of conversational and instruction-following applications.
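Because the context window is fixed at 4096 tokens, a caller must budget the prompt and the generated tokens together so their sum stays within the window. Below is a minimal, library-agnostic sketch of that arithmetic; the constant names and the `fit_prompt` helper are hypothetical, not part of the model or any specific API:

```python
CTX_LEN = 4096        # context window stated on this model card
MAX_NEW_TOKENS = 512  # example generation budget (an assumption, tune per use case)

def max_prompt_tokens(ctx_len: int = CTX_LEN, reserve: int = MAX_NEW_TOKENS) -> int:
    """Tokens left for the prompt after reserving room for generation."""
    return ctx_len - reserve

def fit_prompt(token_ids: list[int],
               ctx_len: int = CTX_LEN,
               reserve: int = MAX_NEW_TOKENS) -> list[int]:
    """Truncate a tokenized prompt from the left, keeping the most recent
    tokens, so that prompt + generated output fits in the context window."""
    budget = ctx_len - reserve
    return token_ids[-budget:]

print(max_prompt_tokens())  # → 3584
```

In practice the same budgeting applies whichever inference stack serves the model: an over-long prompt either raises an error or silently evicts the oldest tokens, so truncating deliberately (as above) keeps behavior predictable.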
