helloatithya/Neura_Veltrixa
Task: Text generation
- Concurrency cost: 1
- Model size: 1.5B
- Quantization: BF16
- Context length: 32k
- Published: Mar 13, 2026
- License: apache-2.0
- Architecture: Transformer
- Weights: Open

Neura_Veltrixa is a 1.5-billion-parameter instruction-tuned causal language model by helloatithya, fine-tuned from Qwen/Qwen2.5-1.5B-Instruct. It supports a 32,768-token context length and targets agentic and legal applications: legal reasoning and tool-using, agent-style tasks. Its fine-tuning draws on datasets such as databricks/databricks-dolly-15k.
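Because the base model is Qwen/Qwen2.5-1.5B-Instruct, this fine-tune presumably inherits Qwen's ChatML-style chat template. The sketch below builds such a prompt by hand to show the expected format; the helper name `build_chat_prompt` is illustrative, and in practice `AutoTokenizer.apply_chat_template` from the `transformers` library produces this string for you.

```python
def build_chat_prompt(system: str, user: str) -> str:
    """Assemble a ChatML-style prompt as used by Qwen2.5-family models.

    Assumption: the fine-tune keeps the base model's <|im_start|>/<|im_end|>
    chat markers; check the repository's tokenizer_config.json to confirm.
    """
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        "<|im_start|>assistant\n"  # generation continues from here
    )


if __name__ == "__main__":
    prompt = build_chat_prompt(
        "You are a careful legal assistant.",
        "Summarize the indemnification clause in plain language.",
    )
    print(prompt)
```

With `transformers` installed, the same prompt would typically be fed to the model via `AutoTokenizer.from_pretrained("helloatithya/Neura_Veltrixa")` and `AutoModelForCausalLM.from_pretrained(..., torch_dtype="bfloat16")`, matching the BF16 quantization listed above.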
