AhyanCreationsLTD/Aira-Learning
Text Generation · Model Size: 7B · Quantization: FP8 · Context Length: 4k · Concurrency Cost: 1 · Architecture: Transformer · Published: Mar 31, 2026

AhyanCreationsLTD/Aira-Learning is a 7-billion-parameter, general-purpose language model developed by AhyanCreationsLTD. Specific architectural details, training data, and unique differentiators are not provided in its current documentation. It is intended for direct use in a variety of natural language processing tasks and supports a context length of 4,096 tokens.
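
Since the documentation does not specify a loading mechanism, the following is a minimal usage sketch assuming the model is published on the Hugging Face Hub under this repository ID and is compatible with the standard transformers causal-LM API; adjust if the model is hosted or served differently.

```python
# Hypothetical usage sketch: assumes "AhyanCreationsLTD/Aira-Learning" resolves
# on the Hugging Face Hub and works with the standard transformers API.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "AhyanCreationsLTD/Aira-Learning"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Keep the prompt plus generated tokens within the 4,096-token context window.
prompt = "Explain the difference between supervised and unsupervised learning."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```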