AiLab-IMCS-UL/Llama3.1-8B-Instruct-LVportals-15K

Task: Text generation · Concurrency cost: 1 · Model size: 8B · Quant: FP8 · Context length: 32k · Published: May 19, 2025 · License: apache-2.0 · Architecture: Transformer · Open weights

AiLab-IMCS-UL/Llama3.1-8B-Instruct-LVportals-15K is an 8-billion-parameter instruction-tuned Llama 3.1 model developed by AiLab-IMCS-UL. It is fine-tuned on approximately 15,000 question-answer pairs from the LVportals.lv archive, specializing it in answering questions about Latvian legislation. Its 32,768-token context length allows lengthy legislative passages to fit in a single prompt.


Model Overview

This model, Llama3.1-8B-Instruct-LVportals-15K, is an 8-billion-parameter instruction-tuned variant of the Llama 3.1 architecture, developed by AiLab-IMCS-UL. Its primary distinction is its specialized fine-tuning for Latvian legislation question-answering: the model was trained on a dataset of approximately 15,000 question-answer pairs sourced from the LVportals.lv archive.

Key Capabilities

  • Specialized Legislative Q&A: Designed to provide accurate answers regarding Latvian legislation.
  • Llama 3.1 Foundation: Benefits from the robust capabilities of the Llama 3.1 base model.
  • Quantized Versions Available: Supports GGUF format for use with local LLM runtime environments like Ollama.
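If a GGUF export of the model is available, it can be loaded into a local Ollama instance via a Modelfile. The sketch below assumes a locally downloaded GGUF file; the file name and the Ollama model name are placeholders, not names published by the authors:

```shell
# Create a Modelfile pointing at the local GGUF file
# (llama3.1-8b-lvportals-15k.Q4_K_M.gguf is a placeholder name).
cat > Modelfile <<'EOF'
FROM ./llama3.1-8b-lvportals-15k.Q4_K_M.gguf
EOF

# Register the model with Ollama and ask it a Latvian-law question
ollama create llama3.1-lvportals -f Modelfile
ollama run llama3.1-lvportals "Kādi ir darba līguma uzteikšanas termiņi?"
```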

Training and Evaluation

The data preparation, fine-tuning methodology, and comprehensive evaluation of this model are detailed in the Master's Thesis by Artis Pauniņš, titled "Evaluation and Adaptation of Large Language Models for Question-Answering on Legislation" (University of Latvia, 2025).

Important Considerations

  • The model may occasionally generate verbose responses. Users are advised to manage output length by setting the num_predict parameter in their application or Modelfile to limit token generation.
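One way to apply this advice is to set num_predict directly in the Ollama Modelfile, so the cap travels with the model definition. A minimal sketch, assuming a local GGUF file with a placeholder name:

```
FROM ./llama3.1-8b-lvportals-15k.Q4_K_M.gguf
# Cap generation at 512 tokens to curb verbose answers
PARAMETER num_predict 512
```

Alternatively, the limit can be supplied per request through Ollama's API by passing num_predict in the request's options object.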