KAERI-MLP/AtomicGPT-gemma3-27b
Capabilities: Vision · Concurrency Cost: 2 · Model Size: 27B · Quant: FP8 · Ctx Length: 32k · Published: Nov 14, 2025 · License: gemma · Architecture: Transformer

KAERI-MLP/AtomicGPT-gemma3-27b is a 27-billion-parameter bilingual (Korean-English) large language model developed by KAERI-MLP. It has been continually pre-trained and instruction-tuned on nuclear engineering datasets, specializing it for nuclear-domain tasks such as reactor physics, safety, materials, and regulation. It is an open-weight variant of the AtomicGPT architecture, enabling reproducible research in domain-specific LLM adaptation.
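A minimal usage sketch with the Hugging Face `transformers` library is below. The repo id comes from this card; the loading details (`torch_dtype="auto"`, `device_map="auto"`) and the example question are assumptions, and a 27B model will require substantial GPU memory even at FP8.

```python
# Sketch of single-turn inference with AtomicGPT-gemma3-27b via transformers.
# Assumptions: standard AutoModel/AutoTokenizer API and a Gemma-style chat
# template; generation settings are illustrative, not from the model card.

model_id = "KAERI-MLP/AtomicGPT-gemma3-27b"

def build_messages(question: str) -> list[dict]:
    # Single-turn chat format expected by apply_chat_template.
    return [{"role": "user", "content": question}]

def main() -> None:
    # Imported lazily so the helper above stays usable without transformers.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id, torch_dtype="auto", device_map="auto"
    )

    messages = build_messages("What is the void coefficient of reactivity?")
    inputs = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)

    out = model.generate(inputs, max_new_tokens=256)
    # Decode only the newly generated tokens, not the prompt.
    print(tokenizer.decode(out[0][inputs.shape[-1]:], skip_special_tokens=True))

if __name__ == "__main__":
    main()
```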
