vanta-research/mox-tiny-1
Text generation · Concurrency cost: 1 · Model size: 8B · Quantization: FP8 · Context length: 32k · Published: Jan 16, 2026 · License: llama3.1 · Architecture: Transformer

Mox-Tiny-1 is an 8 billion parameter language model developed by VANTA Research, built on Llama 3.1 8B Instruct with an extended context length of 131,072 tokens. It is fine-tuned for authentic human-AI collaboration, distinguishing itself by offering direct opinions, constructive disagreement, and epistemically calibrated responses rather than prioritizing user validation. It is intended for use cases that call for a thinking partner: complex problem-solving, honest feedback, and intellectual exploration in technical and philosophical discussions.
