emylton/arogya-ai-full
Task: Text generation
Concurrency cost: 1
Model size: 8B
Quantization: FP8
Context length: 8k
Published: Mar 15, 2026
License: llama3
Architecture: Transformer

The emylton/arogya-ai-full model is an 8-billion-parameter Llama 3-based language model developed by Rafael and contributors, fine-tuned for health data analysis and disease prediction in the Maluku Tenggara Regency, Indonesia. It covers 7 key diseases across 9 sub-districts and was trained on over 10,000 real health records. This full merged model is ready to use, supporting applications such as disease prediction, trend analysis, and intervention recommendations within its specialized geographic and disease scope.
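Since the full merged weights are published under this repo id, inference should work with the standard Hugging Face Transformers causal-LM API. The sketch below is an assumption, not documented on this card: the prompt wording and generation settings are illustrative, and the card does not specify a required prompt template.

```python
# Minimal usage sketch, assuming the standard Transformers causal-LM API.
# The repo id comes from this card; the prompt text and generation
# parameters are hypothetical examples, not documented by the authors.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "emylton/arogya-ai-full"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Example question within the model's stated scope (hypothetical prompt).
prompt = "Summarize recent disease trends in the Kei Kecil sub-district."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Note that the 8k context length and FP8 quantization listed above may require recent GPU hardware and a Transformers/bitsandbytes stack that supports them; check the serving environment before deploying.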
