openaccess-ai-collective/jeopardy-bot

Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4K · License: apache-2.0 · Architecture: Transformer · Open Weights

The openaccess-ai-collective/jeopardy-bot is a 7 billion parameter language model, fine-tuned from LLaMA, specifically designed to answer Jeopardy-style questions. It excels at providing concise responses to short queries, leveraging its specialized training on the Jeopardy dataset. The model is optimized for rapid and accurate factual recall within the game-show format, making it suitable for trivia and question-answering applications.


Jeopardy Bot: Optimized for Trivia

Fine-tuned from LLaMA 7B, jeopardy-bot is engineered specifically for Jeopardy-style question answering: it takes short clues as input and returns concise answers, mirroring the format of the popular game show.

Key Capabilities

  • Specialized Question Answering: Highly effective at interpreting and responding to Jeopardy-style clues.
  • Concise Responses: Generates brief, direct answers suitable for trivia formats.
  • Factual Recall: Optimized for quick retrieval of factual information relevant to diverse categories.
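The model can be queried like any Hugging Face causal LM. The sketch below shows one plausible way to format a clue and generate an answer; the prompt template (`Category: … / Clue: … / Answer:`) is an assumption for illustration, not a documented format, so check the model card and the jeopardy dataset for the template actually used during fine-tuning.

```python
MODEL_ID = "openaccess-ai-collective/jeopardy-bot"


def build_prompt(category: str, clue: str) -> str:
    """Format a Jeopardy-style clue as a short instruction prompt.

    NOTE: this template is an assumption; the exact format used during
    fine-tuning may differ.
    """
    return f"Category: {category}\nClue: {clue}\nAnswer:"


def answer_clue(category: str, clue: str, max_new_tokens: int = 32) -> str:
    """Generate a concise answer for a single clue."""
    # Imported lazily so build_prompt() works without transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID)

    inputs = tokenizer(build_prompt(category, clue), return_tensors="pt")
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    text = tokenizer.decode(output[0], skip_special_tokens=True)
    # Keep only the generated continuation after the "Answer:" marker.
    return text.split("Answer:")[-1].strip()


# Example call (downloads ~7B weights on first use):
# answer_clue("U.S. PRESIDENTS",
#             "He was the only president to serve two non-consecutive terms.")
```

Greedy decoding with a small `max_new_tokens` budget suits the model's short, factual answer style; raise the budget only if answers come back truncated.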

Training Details

The model was trained with supervised fine-tuning (SFT) on LLaMA 7B using the jeopardy dataset, which contains approximately 216K rows. Training ran for about 2 epochs, with hyperparameters available in the YAML config for Axolotl. The training process focused on maximizing accuracy for the unique question-and-answer structure of Jeopardy.
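To make the setup concrete, here is a minimal sketch of what such an Axolotl config might look like. Only the base model family, the dataset, and the epoch count come from this card; the dataset path, prompt type, and every hyperparameter below are placeholder assumptions — consult the actual YAML config published with the model for the real values.

```yaml
# Illustrative Axolotl config sketch -- NOT the model's actual config.
base_model: huggyllama/llama-7b          # assumed LLaMA 7B checkpoint
datasets:
  - path: openaccess-ai-collective/jeopardy   # assumed dataset path
    type: alpaca                         # assumed prompt format
sequence_len: 4096                       # matches the advertised 4K context
num_epochs: 2                            # from the model card
micro_batch_size: 4                      # assumed
gradient_accumulation_steps: 8           # assumed
learning_rate: 0.00002                   # assumed
optimizer: adamw_torch                   # assumed
```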

Good For

  • Trivia Applications: Ideal for building bots or systems that require answering trivia questions quickly and accurately.
  • Educational Tools: Can be integrated into learning platforms for interactive quizzes.
  • Game Development: Useful for creating AI opponents or hint systems in trivia games.