openaccess-ai-collective/jackalope-7b

Text Generation · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Concurrency Cost: 1 · Published: Oct 7, 2023 · License: apache-2.0 · Architecture: Transformer · Open Weights

Jackalope 7B is a 7 billion parameter language model developed by OpenAccess AI Collective, fine-tuned from Mistral 7B. It is optimized for multi-turn chat, trained on curated datasets including SlimOrca and PIPPA. The model offers a balance between performance and general utility, making it suitable for conversational AI applications.


Jackalope 7B: Multi-Turn Chat Optimized Language Model

Jackalope 7B, developed by OpenAccess AI Collective, is a 7 billion parameter model built upon the Mistral 7B architecture. It has been fine-tuned using a combination of the SlimOrca dataset, PIPPA, and other open datasets, specifically aiming to enhance its multi-turn chat capabilities.

Key Capabilities

  • Enhanced Multi-Turn Chat: The model is specifically trained to improve its ability to handle extended conversations, making it suitable for interactive applications.
  • Dataset Efficiency: Demonstrates that the relatively compact SlimOrca dataset is sufficient to produce a capable model.
  • OpenAI ChatML Format: Uses the OpenAI Chat Markup Language (ChatML) for prompt templating, ensuring compatibility with tools and frameworks such as oobabooga and Hugging Face Transformers' apply_chat_template().
  • Solid Benchmark Performance: Achieves competitive scores on the Hugging Face Leaderboard, with an average of 65.06 across MMLU, ARC, HellaSwag, and TruthfulQA, positioning it as a strong general-purpose model for its size.
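The ChatML format mentioned above can be sketched in a few lines. This is a minimal illustration of the prompt layout only; the `to_chatml` helper is hypothetical, and in practice you would let the tokenizer's `apply_chat_template()` render the prompt so the delimiter tokens match the model's vocabulary exactly:

```python
def to_chatml(messages):
    """Render a list of {role, content} dicts as a ChatML prompt,
    ending with an open assistant turn for the model to complete."""
    parts = [
        f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>"
        for m in messages
    ]
    # Leave the assistant turn open so the model generates the reply.
    parts.append("<|im_start|>assistant\n")
    return "\n".join(parts)

conversation = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "What is a jackalope?"},
]
print(to_chatml(conversation))
```

Each turn is wrapped in `<|im_start|>{role}` / `<|im_end|>` markers, which is what lets the model track speaker boundaries across multi-turn conversations.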

Good For

  • Conversational AI: Ideal for chatbots, virtual assistants, and other applications requiring coherent and extended dialogue.
  • General-Purpose Language Tasks: Offers a reasonable performance-to-size trade-off for a variety of tasks, especially those where multi-turn interaction is beneficial.
  • Developers Using ChatML: Integrates seamlessly for those already working with, or planning to adopt, the ChatML format for instruction tuning.