zjunlp/knowlm-13b-zhixi

Text generation · Concurrency cost: 1 · Model size: 13B · Quantization: FP8 · Context length: 4k · License: apache-2.0 · Architecture: Transformer

zjunlp/knowlm-13b-zhixi is a 13-billion-parameter language model developed by zjunlp and fine-tuned on instruction data. Built on the KnowLM-13B-Base foundation, it is designed for instruction-following tasks. It processes inputs with a context length of 4096 tokens, making it suitable for applications that require close adherence to specific instructions.


KnowLM-13B-Zhixi Overview

KnowLM-13B-Zhixi is a 13-billion-parameter language model developed by zjunlp. It is an instruction-tuned variant of the foundational KnowLM-13B-Base model, fine-tuned on instruction data and designed to excel at tasks that require precise instruction following.

Key Capabilities

  • Instruction Following: Optimized for understanding and executing directives provided in prompts.
  • Base Model Enhancement: Leverages the capabilities of the Knowlm-13B-Base model, further refined for interactive and task-oriented applications.
  • Context Handling: Supports a context length of 4096 tokens, allowing it to process moderately long inputs and instructions.
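As a rough illustration of how the model might be used, the sketch below loads it via Hugging Face transformers and trims over-long prompts to fit the 4096-token context window. The repository id, dtype, and generation settings are assumptions rather than values taken from official documentation, and running the model itself requires GPU-class hardware with roughly 26 GB of memory in fp16.

```python
# Hypothetical usage sketch for KnowLM-13B-Zhixi (assumptions noted inline).
MODEL_ID = "zjunlp/knowlm-13b-zhixi"  # assumed Hugging Face repository id
MAX_CTX = 4096                        # context length stated on this card


def clip_to_context(input_ids, max_ctx=MAX_CTX, reserve=256):
    """Drop the oldest tokens so the prompt plus `reserve` newly generated
    tokens still fit inside the model's context window."""
    budget = max_ctx - reserve
    return input_ids[-budget:] if len(input_ids) > budget else input_ids


def generate_reply(prompt, max_new_tokens=256):
    """Run one instruction through the model (heavy: downloads the weights)."""
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype=torch.float16, device_map="auto"
    )
    ids = clip_to_context(tokenizer.encode(prompt))
    inputs = torch.tensor([ids]).to(model.device)
    out = model.generate(inputs, max_new_tokens=max_new_tokens)
    # Return only the newly generated continuation, not the echoed prompt.
    return tokenizer.decode(out[0][len(ids):], skip_special_tokens=True)
```

Keeping the most recent tokens (rather than the oldest) when clipping preserves the instruction closest to the generation point, which usually matters most for instruction-following models.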

Good For

  • Applications requiring models to adhere strictly to given instructions.
  • Tasks where a fine-tuned base model offers improved performance over its general-purpose counterpart.
  • Developers looking for a 13B-parameter model focused on instruction-based interactions.

For more details, refer to the KnowLM project on GitHub.