davzoku/stock_market_expert_3b

Hugging Face · Text Generation

  • Model Size: 3.2B
  • Quantization: BF16
  • Context Length: 32k
  • Published: Mar 19, 2025
  • License: apache-2.0
  • Architecture: Transformer (open weights)
  • Concurrency Cost: 1

davzoku/stock_market_expert_3b is a 3.2-billion-parameter domain-specific expert model from the Moecule family of Mixture-of-Experts (MoE) models. Built on the unsloth/llama-3.2-3b-Instruct base model and fine-tuned with 4-bit QLoRA, it is designed specifically for stock market analysis and provides specialized insights within the financial domain.
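A minimal usage sketch with the Hugging Face transformers library. The repo id is taken from the model card, but the chat-style prompt helper, system prompt, and generation settings below are illustrative assumptions, since the card does not document a prompt format:

```python
# Hedged sketch: querying the model via the transformers chat-template API.
# The system prompt and generation settings are assumptions, not documented values.

def build_messages(question: str) -> list:
    """Wrap a stock-market question in a chat-style message list."""
    return [
        {"role": "system", "content": "You are a stock market analysis expert."},
        {"role": "user", "content": question},
    ]

def generate_answer(question: str,
                    model_id: str = "davzoku/stock_market_expert_3b") -> str:
    """Load the model and generate an answer (needs enough RAM for 3.2B params)."""
    # Imports are local so the prompt helper above stays dependency-free.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="bfloat16")
    inputs = tokenizer.apply_chat_template(
        build_messages(question), add_generation_prompt=True, return_tensors="pt"
    )
    output = model.generate(inputs, max_new_tokens=256)
    # Decode only the newly generated tokens, not the echoed prompt.
    return tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True)

# Example (downloads the model weights on first use):
# print(generate_answer("What does a rising P/E ratio typically signal?"))
```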


Overview

The davzoku/stock_market_expert_3b is a specialized 3.2-billion-parameter language model developed by the davzoku team: CHOCK Wan Kee, Farlin Deva Binusha DEVASUGIN MERLISUGITHA, GOH Bao Sheng, Jessica LEK Si Jia, Sinha KHUSHI, and TENG Kok Wai (Walter). It is a domain-specific expert within the larger Moecule family of Mixture-of-Experts (MoE) models, tailored for stock market-related tasks.

Key Capabilities

  • Domain Expertise: Acts as a focused expert model for stock market analysis rather than a general-purpose assistant.
  • MoE Architecture: Serves as an expert component within the Moecule Mixture-of-Experts framework.
  • Efficient Fine-tuning: Fine-tuned with 4-bit QLoRA via Unsloth, which lowers the memory and compute cost of adaptation.
  • Base Model: Built on unsloth/llama-3.2-3b-Instruct, inheriting its instruction-following and general language abilities.
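The QLoRA setup described above can be sketched with standard transformers and peft configuration objects. Every hyperparameter below (rank, alpha, dropout, target modules) is illustrative, not taken from the authors' actual training recipe:

```python
from peft import LoraConfig
from transformers import BitsAndBytesConfig

# 4-bit quantization of the frozen base model (NF4 is the QLoRA default).
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_use_double_quant=True,
    bnb_4bit_compute_dtype="bfloat16",
)

# Low-rank adapters trained on top of the quantized, frozen weights.
# Values here are common defaults, not the authors' documented settings.
lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    lora_dropout=0.05,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
    task_type="CAUSAL_LM",
)
```

Only the small adapter weights are trained, which is why a 3.2B base model can be fine-tuned on a single consumer GPU.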

Good For

  • Financial Analysis: Applications requiring specialized knowledge of stock markets.
  • MoE Systems: Integration as an expert into Moecule-based Mixture-of-Experts architectures.
  • Resource-Efficient Deployment: At 3.2B parameters the model runs on modest hardware, and QLoRA keeps fine-tuning costs low for specific financial applications.