ibivibiv/athene-noctua-13b
Text generation | 13B parameters | FP8 quantization | 4K context length | Concurrency cost: 1 | Published: Jan 22, 2024 | License: llama2 | Architecture: Transformer | Open weights

ibivibiv/athene-noctua-13b is a 13-billion-parameter auto-regressive language model fine-tuned by ibivibiv from the Llama 2 transformer architecture. This English-language model is trained specifically for logic enforcement and critical-thinking tasks, and it performs well in logic-puzzle testing for its size. It is primarily targeted at planning exercises and does well on benchmarks such as the AI2 Reasoning Challenge (ARC).
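As a minimal sketch, the model can be loaded with the Hugging Face `transformers` library using the repo id above. Note that the `[INST] … [/INST]` prompt wrapper shown here is an assumption based on the Llama 2 chat convention; the card does not specify the model's expected prompt format, and the example instruction is purely illustrative.

```python
# Repo id taken from the model card above.
MODEL_ID = "ibivibiv/athene-noctua-13b"


def build_prompt(instruction: str) -> str:
    """Wrap a user instruction in a Llama-2-style chat template (assumed,
    not confirmed by the model card)."""
    return f"[INST] {instruction} [/INST]"


def main() -> None:
    # Imports are kept local so build_prompt stays usable without
    # torch/transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")

    # Illustrative logic-puzzle prompt, matching the model's stated focus.
    prompt = build_prompt(
        "If all bloops are razzies and all razzies are lazzies, "
        "are all bloops definitely lazzies?"
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=128)
    print(tokenizer.decode(output[0], skip_special_tokens=True))
```

Calling `main()` downloads the 13B checkpoint (tens of GB), so it is left to the caller rather than run on import.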
