EQUES/JPharmatron-7B
Text generation · Concurrency cost: 1 · Model size: 7.6B · Quantization: FP8 · Context length: 32k · Published: Apr 22, 2025 · License: CC BY-SA 4.0 · Architecture: Transformer · Open weights

JPharmatron-7B is a 7-billion-parameter, decoder-only causal large language model developed by EQUES Inc. Built on Qwen2.5-7B, it was continually pre-trained on 8.8 billion tokens of Japanese and English data and is designed and optimized for pharmaceutical applications and research, where it outperforms other general-purpose and domain-specific models of similar size on pharmaceutical benchmarks. Its chat capabilities come from merging with Qwen2.5-7B-Instruct, making the model suitable for pharmaceutical paperwork and research tasks.
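Since the model inherits the Qwen2.5 chat format, it can be loaded and prompted with the standard Hugging Face `transformers` API. The sketch below is illustrative, not an official recipe from EQUES: the system prompt, generation settings, and the example question are assumptions, and the heavy imports are deferred into `main()` so the prompt-building helper can be used without the model downloaded.

```python
MODEL_ID = "EQUES/JPharmatron-7B"  # repo name from this card


def build_chat(question: str) -> list[dict]:
    """Wrap a question in the system/user chat turns that Qwen2.5-based
    models expect. The system prompt is an illustrative assumption."""
    return [
        {"role": "system", "content": "You are a pharmaceutical research assistant."},
        {"role": "user", "content": question},
    ]


def main() -> None:
    # Deferred import: transformers (and a ~7.6B download) is only needed
    # when actually generating, not for building prompts.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, device_map="auto", torch_dtype="auto"
    )
    messages = build_chat("Summarize the mechanism of action of aspirin.")
    input_ids = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)
    output = model.generate(input_ids, max_new_tokens=256)
    # Decode only the newly generated tokens, skipping the prompt.
    print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))


if __name__ == "__main__":
    main()
```

For Japanese pharmaceutical paperwork, the user turn can simply be written in Japanese; the model was continually pre-trained on both Japanese and English data.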
