ClaudioSavelli/FAME_gold_llama32-1b-instruct-qa

TEXT GENERATION · Concurrency Cost: 1 · Model Size: 1B · Quant: BF16 · Ctx Length: 32k · Published: Apr 2, 2026 · License: other · Architecture: Transformer

ClaudioSavelli/FAME_gold_llama32-1b-instruct-qa is a 1-billion-parameter instruction-tuned language model, retrained for the FAME setting. Built on the Llama-3.2-1B-Instruct architecture, it supports a 32,768-token context length and is optimized for question-answering tasks within the FAME framework.


Model Overview

ClaudioSavelli/FAME_gold_llama32-1b-instruct-qa is a 1-billion-parameter instruction-tuned language model developed by ClaudioSavelli. It is a "Gold" retrained variant designed specifically for the FAME setting, indicating a specialized fine-tuning process. The model builds on meta-llama/Llama-3.2-1B-Instruct and supports a context length of 32,768 tokens.

Key Capabilities

  • Specialized for FAME Setting: This model has undergone specific retraining to perform optimally within the FAME (Forecasting and Modeling of Events) framework, as detailed in the associated research paper.
  • Instruction-Following: As an instruct model, it is designed to follow natural language instructions for various tasks.
  • Question Answering: The "-qa" suffix indicates a primary focus and optimization for question-answering tasks.
  • Extended Context Window: With a 32768 token context length, it can process and generate responses based on significantly longer inputs compared to many other models in its size class.
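As an instruct model derived from Llama-3.2-1B-Instruct, it is expected to follow the standard Llama 3 chat template. The sketch below assembles a single-turn QA prompt by hand to show the format; in practice, prefer `tokenizer.apply_chat_template` from the model's own tokenizer, since the template shipped with the checkpoint is authoritative.

```python
# Sketch: building a Llama-3-style chat prompt for a QA query.
# Assumes the standard Llama 3 instruct template (an assumption here;
# the checkpoint's own tokenizer config is authoritative).

def build_qa_prompt(question: str, system: str = "Answer the question concisely.") -> str:
    """Assemble a single-turn QA prompt in the Llama 3 chat format."""
    return (
        "<|begin_of_text|>"
        f"<|start_header_id|>system<|end_header_id|>\n\n{system}<|eot_id|>"
        f"<|start_header_id|>user<|end_header_id|>\n\n{question}<|eot_id|>"
        "<|start_header_id|>assistant<|end_header_id|>\n\n"
    )

prompt = build_qa_prompt("What is the capital of Italy?")
print(prompt)
```

The trailing assistant header leaves the model positioned to generate the answer directly after the prompt.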

Good For

  • FAME-related Applications: Ideal for research and development within the FAME domain, leveraging its specialized retraining.
  • Efficient QA Systems: Suitable for building question-answering systems where a balance of performance and resource efficiency is required, particularly with long context needs.
  • Llama-3.2 Ecosystem Integration: Developers already working with Llama-3.2 models can easily integrate and benefit from this specialized variant.
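For long-context QA, it helps to check up front whether a document and question will fit in the 32,768-token window. The sketch below uses a rough ~4 characters-per-token heuristic (an assumption for English text; for exact counts, tokenize with the model's tokenizer):

```python
# Sketch: rough check that a document plus question fits the 32,768-token
# context window, reserving headroom for the generated answer.
# The CHARS_PER_TOKEN ratio is a heuristic assumption, not a tokenizer count.

CONTEXT_LENGTH = 32_768
CHARS_PER_TOKEN = 4  # rough average for English text (assumption)

def fits_in_context(document: str, question: str, reserve_for_answer: int = 512) -> bool:
    """Estimate whether document + question leave room for the answer."""
    est_tokens = (len(document) + len(question)) // CHARS_PER_TOKEN
    return est_tokens + reserve_for_answer <= CONTEXT_LENGTH

print(fits_in_context("word " * 10_000, "What is this about?"))  # → True (~12.5k est. tokens)
```

Documents that fail this check would need chunking or retrieval before being passed to the model.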