zjunlp/OceanGPT-basic-4B-Thinking
Task: Text Generation
Concurrency Cost: 1
Model Size: 4B
Quantization: BF16
Context Length: 32k
Published: Dec 23, 2025
License: MIT
Architecture: Transformer
Open Weights

OceanGPT-basic-4B-Thinking is a 4-billion-parameter large language model developed by zjunlp. Built on the Qwen3 architecture with a 32,768-token context length, it is trained on extensive English and Chinese corpora from the ocean domain, specializing it for ocean science tasks. The model is designed to serve as a marine knowledge expert, answering ocean-related questions.
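Below is a minimal usage sketch, assuming the repository exposes a standard Qwen3-style chat template through Hugging Face transformers; the example question and generation settings are illustrative, not taken from the model's documentation.

```python
# Minimal sketch: load OceanGPT-basic-4B-Thinking and ask an ocean-domain
# question, assuming a standard Qwen3-style chat interface via transformers.
# Requires: transformers, torch, accelerate.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "zjunlp/OceanGPT-basic-4B-Thinking"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the BF16 weights listed above
    device_map="auto",
)

# Illustrative prompt; any ocean-science question fits the model's domain.
messages = [
    {"role": "user", "content": "What are the main drivers of ocean acidification?"},
]
input_ids = tokenizer.apply_chat_template(
    messages,
    add_generation_prompt=True,
    return_tensors="pt",
).to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=512)
# Decode only the newly generated tokens, skipping the prompt.
response = tokenizer.decode(
    output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True
)
print(response)
```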
