MetapriseInc/Moose-1.0
Task: Text Generation
Concurrency Cost: 1
Model Size: 8B
Quantization: FP8
Context Length: 8k
Published: Mar 10, 2026
License: apache-2.0
Architecture: Transformer
Weights: Open

Moose-1.0 is an 8-billion-parameter large language model developed by MetapriseInc, built on the Meta-Llama-3-8B architecture. It was full-parameter fine-tuned on a large proprietary dataset for enterprise organization management, and is optimized to provide insights and assistance within that specialized domain, making it well suited to business-specific applications.
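Since the card does not document a hosting API or prompt format, the following is a hedged sketch of how a Llama-3-based model like this is commonly loaded with the Hugging Face `transformers` library. The prompt wording and the `run_demo` helper are illustrative assumptions, not part of the model's documentation; only the repository ID `MetapriseInc/Moose-1.0` comes from this card.

```python
MODEL_ID = "MetapriseInc/Moose-1.0"  # repository ID from this card

def build_prompt(question: str) -> str:
    # Illustrative plain-text framing for an enterprise-management
    # question; Moose-1.0's actual prompt format is not documented here.
    return (
        "You are an assistant for enterprise organization management.\n"
        f"Question: {question}\n"
        "Answer:"
    )

def run_demo() -> None:
    # Assumed Hugging Face-style loading; requires `transformers` and
    # enough memory for an 8B FP8 checkpoint.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")
    prompt = build_prompt("How should we structure a new product team?")
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=256)
    print(tokenizer.decode(output[0], skip_special_tokens=True))

print(build_prompt("How should we structure a new product team?"))
```

The heavy model download is kept inside `run_demo()` so the prompt construction can be inspected without pulling the 8B weights.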
