zjunlp/OceanGPT-basic-8B
Task: Text generation
Model size: 8B
Quantization: FP8
Context length: 32K
Published: May 19, 2025
License: MIT
Architecture: Transformer
Open weights: yes

OceanGPT-basic-8B is an 8-billion-parameter large language model developed by zjunlp, built on the Qwen3 architecture with a 32K context length. It is fine-tuned on English and Chinese ocean-domain datasets, specializing it for ocean science tasks, and is intended to give researchers and developers domain-specific capabilities for working with marine data and scientific inquiries.
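A minimal usage sketch, assuming the checkpoint is published on Hugging Face under the id `zjunlp/OceanGPT-basic-8B` and loads with the standard `transformers` causal-LM classes; the `build_prompt` helper is illustrative only, and the model's own chat template (if provided) should be preferred:

```python
MODEL_ID = "zjunlp/OceanGPT-basic-8B"  # assumed Hugging Face repo id


def build_prompt(question: str) -> str:
    """Wrap an ocean-science question in a simple instruction template.

    Illustrative only -- not the model's official prompt format.
    """
    return (
        "You are an assistant specialized in ocean science.\n"
        f"Question: {question}\n"
        "Answer:"
    )


if __name__ == "__main__":
    # Imports kept here so the helper above is usable without the
    # (large) model download; loading an 8B FP8/bf16 checkpoint
    # requires a suitably sized GPU or plenty of RAM.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype="auto", device_map="auto"
    )
    prompt = build_prompt("What drives thermohaline circulation?")
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=256)
    print(tokenizer.decode(output[0], skip_special_tokens=True))
```

The 32K context window leaves ample room for long oceanographic documents in the prompt alongside the question.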
