vanta-research/atom-27b
- Capabilities: Vision
- Concurrency Cost: 2
- Model Size: 27B
- Quantization: FP8
- Context Length: 32K
- Published: Dec 23, 2025
- License: Gemma
- Architecture: Transformer

Atom 27B is a 27-billion-parameter causal language model developed by VANTA Research, built on the Gemma 3 architecture with a 128K-token context length. It is designed as a collaborative AI assistant that engages users as a thinking partner rather than acting only as an information source: it asks clarifying questions, challenges assumptions, and explores the "why" behind problems to foster deeper human-AI collaboration. Atom 27B also incorporates SigLIP vision capabilities for 896px image inputs.