zjunlp/OceanGPT-basic-4B-Instruct
Task: Text generation
Model size: 4B parameters
Quantization: BF16
Context length: 32k tokens
Published: Dec 23, 2025
License: MIT
Architecture: Transformer
Weights: Open

OceanGPT-basic-4B-Instruct is a 4-billion-parameter instruction-tuned causal language model developed by zjunlp and built on the Qwen3 architecture. It is trained on a large English and Chinese corpus focused on ocean science, giving it specialized knowledge for marine-related tasks. With a 32,768-token context length, the model is designed to serve as an expert assistant for diverse questions in the ocean domain.
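As an instruction-tuned causal language model published with open weights, it can be queried through the standard Hugging Face `transformers` chat workflow. The sketch below is illustrative, not an official usage snippet from the model card: it assumes the `transformers` and `torch` packages are installed, and the helper name `ask_ocean_question` and the sample prompt are hypothetical.

```python
"""Minimal sketch: querying OceanGPT-basic-4B-Instruct via transformers."""

# Repo id as listed on the model card.
MODEL_ID = "zjunlp/OceanGPT-basic-4B-Instruct"


def ask_ocean_question(question: str, max_new_tokens: int = 256) -> str:
    """Generate an answer to an ocean-domain question (hypothetical helper)."""
    # Imported lazily so the module can be loaded without the heavy deps.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype="auto", device_map="auto"
    )

    # Format the user turn with the model's chat template.
    messages = [{"role": "user", "content": question}]
    prompt = tokenizer.apply_chat_template(
        messages, tokenize=False, add_generation_prompt=True
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Keep only the newly generated tokens, dropping the echoed prompt.
    answer_ids = output[0][inputs["input_ids"].shape[-1]:]
    return tokenizer.decode(answer_ids, skip_special_tokens=True)


if __name__ == "__main__":
    print(ask_ocean_question("What drives the ocean's thermohaline circulation?"))
```

Since the model is distributed in BF16, the full 4B weights need roughly 8 GB of accelerator memory; `device_map="auto"` lets `transformers` place layers on whatever hardware is available.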
