alibidaran/Platio_merged_model
Text generation · Model size: 8B · Quant: FP8 · Context length: 32k · Concurrency cost: 1 · Published: Feb 3, 2026 · License: MIT · Architecture: Transformer · Open weights

alibidaran/Platio_merged_model is an 8-billion-parameter language model built on LLaMA 3.1, designed for reasoning and conceptual understanding in the humanities and social sciences. It targets domains such as psychology, management, and sociology, with a focus on theoretical analysis, case-study reasoning, and decision-making support. With a 32,768-token context length, Platio_merged_model offers strong analytical depth for academic and research-oriented applications, achieving 74-76% accuracy on MMLU in the relevant subjects.
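A minimal usage sketch, assuming the weights are hosted on the Hugging Face Hub under the repo id shown above and that the model ships a chat template (both are assumptions, not confirmed by this page). The `build_chat` helper and its system prompt are illustrative, not part of the model card:

```python
from typing import List, Dict


def build_chat(question: str) -> List[Dict[str, str]]:
    """Build a chat-format prompt suited to the model's humanities focus.

    The system prompt here is a hypothetical example, not an official one.
    """
    return [
        {
            "role": "system",
            "content": (
                "You are an expert in psychology, management, and sociology. "
                "Reason step by step and justify your conclusions."
            ),
        },
        {"role": "user", "content": question},
    ]


def generate(question: str, max_new_tokens: int = 512) -> str:
    """Run a single generation; requires the transformers/torch stack."""
    # Heavy imports are kept local so the prompt helper above stays
    # importable without transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    repo = "alibidaran/Platio_merged_model"
    tok = AutoTokenizer.from_pretrained(repo)
    model = AutoModelForCausalLM.from_pretrained(repo, device_map="auto")
    inputs = tok.apply_chat_template(
        build_chat(question), add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)
    out = model.generate(inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the prompt.
    return tok.decode(out[0, inputs.shape[-1]:], skip_special_tokens=True)
```

Note that an 8B FP8 model needs roughly 8-10 GB of accelerator memory to serve; `device_map="auto"` lets transformers place the weights across available devices.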
