sociocom/MedPHINER-Llama-3.1-Swallow-8B-Instruct-v0.5
Text generation · Model size: 8B · Quantization: FP8 · Context length: 32k · Published: Mar 18, 2026 · License: other · Architecture: Transformer

MedPHINER-Llama-3.1-Swallow-8B-Instruct-v0.5 by sociocom is an 8-billion-parameter language model with a 32,768-token context window, fine-tuned for Japanese medical text. Built on Llama-3.1-Swallow-8B-Instruct-v0.5, it specializes in identifying and tagging Protected Health Information (PHI) such as age, IDs, phone numbers, job titles, locations, person names, and hospital names in medical documents, and is designed specifically for PHI inference tasks in Japanese healthcare contexts.
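The exact output format of the model is not documented here. As a hypothetical illustration, assuming the model emits XML-style inline tags for the PHI categories listed above (tag names and markup style are assumptions, not confirmed by this card), a downstream parser might extract tagged spans like this:

```python
import re

# Hypothetical tag set derived from the PHI categories in the description:
# age, IDs, phone numbers, job titles, locations, person names, hospital names.
PHI_TAGS = {"AGE", "ID", "PHONE", "JOB", "LOCATION", "PERSON", "HOSPITAL"}

def extract_phi(tagged_text):
    """Extract (tag, span) pairs from assumed <TAG>...</TAG> inline markup.

    This is a sketch: the model's real tagging scheme may differ.
    """
    pattern = re.compile(r"<([A-Z]+)>(.*?)</\1>")
    return [(tag, span) for tag, span in pattern.findall(tagged_text)
            if tag in PHI_TAGS]

# Toy example in the style of a Japanese medical note.
sample = "<PERSON>山田太郎</PERSON>（<AGE>45歳</AGE>）は<HOSPITAL>東京病院</HOSPITAL>を受診した。"
print(extract_phi(sample))
# → [('PERSON', '山田太郎'), ('AGE', '45歳'), ('HOSPITAL', '東京病院')]
```

In practice the tagged text would come from running the model (e.g. via a standard text-generation API) on a de-identification prompt; only the parsing step is shown here.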
