nnpy/Nape-0
Text Generation · Concurrency Cost: 1 · Model Size: 1.1B · Quant: BF16 · Ctx Length: 2k · Published: Nov 14, 2023 · License: MIT · Architecture: Transformer

nnpy/Nape-0 is a 1.1 billion parameter Llama-based causal language model developed by nnpy, currently in an early training phase. Despite its small size, the model aims to exhibit diverse capabilities, supporting a context length of 2048 tokens. It is intended for general language understanding and generation tasks, and serves as a foundation for further development and fine-tuning.
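A minimal usage sketch follows, assuming the weights are hosted on the Hugging Face Hub under this card's repo id and load with the standard Transformers causal-LM API; the prompt and generation settings are illustrative, not from the card.

```python
MODEL_ID = "nnpy/Nape-0"  # repo id from this card
CTX_LEN = 2048            # context length stated on the card


def generate(prompt: str, max_new_tokens: int = 50) -> str:
    """Load the model in BF16 (the card's listed precision) and complete a prompt."""
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype=torch.bfloat16)

    # Truncate the prompt to the model's 2048-token context window.
    inputs = tokenizer(prompt, return_tensors="pt", truncation=True, max_length=CTX_LEN)
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)
```

Calling `generate("Once upon a time")` downloads roughly 2.2 GB of BF16 weights on first use, so a GPU or a machine with several GB of free RAM is advisable.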
