xammi/MunicipalPredictionModel-Llama3

Hugging Face
- Task: Text generation
- Concurrency cost: 1
- Model size: 8B
- Quantization: FP8
- Context length: 32k
- License: apache-2.0
- Architecture: Transformer (open weights)

The xammi/MunicipalPredictionModel-Llama3 is an 8 billion parameter Llama-based language model developed by xammi, fine-tuned from xammi/municipal-prediction-merged-llama-model-v1.0. The model was trained using Unsloth and Hugging Face's TRL library, which accelerated training. It is designed for municipal prediction tasks, building on the Llama 3 architecture and a 32,768-token context length.


Model Overview

The xammi/MunicipalPredictionModel-Llama3 is an 8 billion parameter Llama-based language model developed by xammi. It was fine-tuned from the xammi/municipal-prediction-merged-llama-model-v1.0 base model.

Key Characteristics

  • Architecture: Based on the Llama 3 family of models.
  • Parameter Count: Features 8 billion parameters, offering a balance between performance and computational efficiency.
  • Context Length: Supports a substantial context window of 32,768 tokens, allowing the model to process longer inputs and maintain conversational coherence over extended interactions.
  • Training Efficiency: The model was trained with Unsloth and Hugging Face's TRL library, enabling roughly 2x faster training.
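To make the 32,768-token context window concrete, here is a minimal sketch of how an application might keep a multi-turn conversation within that budget by dropping the oldest turns first. The function name and the idea of passing per-turn token counts are illustrative assumptions, not part of this model's API; in practice the model's own tokenizer would supply the counts.

```python
# Assumed context budget from the model card: 32,768 tokens.
MAX_CTX = 32_768

def fit_to_context(turn_token_counts, reserve_for_output=1024):
    """Hypothetical helper: return indices of the most recent turns whose
    combined token count fits in the context window, leaving headroom
    for the generated output."""
    budget = MAX_CTX - reserve_for_output
    kept, total = [], 0
    # Walk backwards from the newest turn, keeping turns until the
    # budget would be exceeded.
    for i in range(len(turn_token_counts) - 1, -1, -1):
        if total + turn_token_counts[i] > budget:
            break
        total += turn_token_counts[i]
        kept.append(i)
    return list(reversed(kept))
```

For example, with turns of 30,000, 2,000, and 500 tokens and 1,024 tokens reserved for output, only the last two turns fit, so the helper returns `[1, 2]`.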

Primary Use Case

This model is specifically designed and fine-tuned for municipal prediction tasks. Its specialized training makes it suitable for applications requiring analysis and forecasting related to municipal data and operations. The Apache-2.0 license allows for broad usage and modification.
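As a usage sketch, the snippet below assembles a municipal-prediction prompt by hand, assuming the model retains the standard Llama 3 instruct chat template (the model card does not confirm this; the bundled tokenizer's chat template would be authoritative). The helper name and example messages are illustrative.

```python
def build_prompt(system_msg: str, user_msg: str) -> str:
    """Hypothetical prompt builder using the standard Llama 3 instruct
    template: system and user turns, then an open assistant header
    for the model to complete."""
    return (
        "<|begin_of_text|>"
        "<|start_header_id|>system<|end_header_id|>\n\n"
        f"{system_msg}<|eot_id|>"
        "<|start_header_id|>user<|end_header_id|>\n\n"
        f"{user_msg}<|eot_id|>"
        "<|start_header_id|>assistant<|end_header_id|>\n\n"
    )

prompt = build_prompt(
    "You are an analyst forecasting municipal service demand.",
    "Given last year's permit volumes, project next quarter's totals.",
)
```

In practice, `tokenizer.apply_chat_template` from Hugging Face Transformers is the safer way to format turns, since it reads the template shipped with the model.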