AuraIndustries/Aura-8B
Text Generation · Model Size: 8B · Quant: FP8 · Ctx Length: 32k · Published: Dec 8, 2024 · License: apache-2.0 · Architecture: Transformer · Concurrency Cost: 1

Aura-8B is an 8-billion-parameter instruction-tuned causal language model developed by Aura Industries and Anthracite Org, based on arcee-ai/Llama-3.1-SuperNova-Lite. The model is specifically designed and optimized for roleplaying tasks, having been fine-tuned on hundreds of millions of tokens of instruction and roleplaying data. It features a distinctive output style resulting from Kahneman-Tversky Optimization (KTO) applied as a Low-Rank Adapter (LoRA), and supports a maximum context length of 8,192+ tokens.
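Since Aura-8B is based on Llama-3.1-SuperNova-Lite, it presumably expects the standard Llama 3.1 instruct chat template. As a minimal sketch (assuming the Llama 3.1 special-token format carries over unchanged from the base model), a roleplay prompt can be assembled like this before being passed to a tokenizer or inference endpoint:

```python
def build_prompt(system: str, user: str) -> str:
    """Build a single-turn Llama 3.1-style chat prompt.

    Assumption: Aura-8B inherits the Llama 3.1 instruct template from its
    base model (arcee-ai/Llama-3.1-SuperNova-Lite); check the tokenizer's
    chat template before relying on this exact format.
    """
    return (
        "<|begin_of_text|>"
        "<|start_header_id|>system<|end_header_id|>\n\n"
        f"{system}<|eot_id|>"
        "<|start_header_id|>user<|end_header_id|>\n\n"
        f"{user}<|eot_id|>"
        # Leave the assistant header open so the model generates the reply.
        "<|start_header_id|>assistant<|end_header_id|>\n\n"
    )


prompt = build_prompt(
    system="You are a knight errant in a medieval fantasy roleplay.",
    user="A stranger approaches your campfire at night. What do you do?",
)
print(prompt)
```

In practice, `transformers.AutoTokenizer.from_pretrained("AuraIndustries/Aura-8B").apply_chat_template(...)` is the safer route, since it uses whatever template actually ships with the model.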
