vanta-research/wraith-8b
Text generation
Model size: 8B · Quant: FP8 · Context length: 32k · Published: Oct 28, 2025 · License: llama3.1 · Architecture: Transformer

Wraith-8B by VANTA Research is an 8.03-billion-parameter fine-tune of Meta's Llama 3.1 8B Instruct model, featuring a 131,072-token context length. It is specifically optimized for mathematical reasoning, achieving 70% accuracy on GSM8K, a 37% relative improvement over its base model. This model is the first in the VANTA Research Entity Series, designed with a distinctive 'cosmic intelligence' personality to enhance STEM analysis and logical deduction.
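As a quick sanity check of the reported benchmark numbers, assuming "37% relative improvement" means (new − base) / base, the implied GSM8K accuracy of the base Llama 3.1 8B Instruct model can be back-computed:

```python
# Back-compute the base model's implied GSM8K accuracy from the
# figures stated above (70% for Wraith-8B, +37% relative improvement).
wraith_acc = 0.70
relative_improvement = 0.37

# relative improvement = (wraith_acc - base_acc) / base_acc
# => base_acc = wraith_acc / (1 + relative_improvement)
base_acc = wraith_acc / (1 + relative_improvement)

print(f"Implied base GSM8K accuracy: {base_acc:.1%}")  # ~51.1%
```

This puts the implied base-model score around 51%, consistent with the claim being a relative (not absolute) improvement.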
