TAIRC/WizardLM-13b-V1.0
Text Generation
- Concurrency Cost: 1
- Model Size: 13B
- Quantization: FP8
- Context Length: 4k
- License: apache-2.0
- Architecture: Transformer
- Open Weights

TAIRC/WizardLM-13b-V1.0 is a 13-billion-parameter Transformer language model from TAIRC with a 4096-token context window. It targets general language understanding and generation, providing a foundation for a broad range of natural language processing tasks.
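Because the context window is capped at 4096 tokens, long prompts must be trimmed before inference. A minimal sketch of one common approach (keeping the most recent tokens and reserving room for generation); `fit_prompt` and the `gen_reserve` value are illustrative names, not part of the model's API:

```python
MAX_CTX = 4096  # context length of WizardLM-13b-V1.0

def fit_prompt(token_ids, max_ctx=MAX_CTX, gen_reserve=512):
    """Keep the most recent tokens so prompt + generated output fit the window.

    token_ids:   list of token ids (assumed already tokenized)
    gen_reserve: hypothetical budget held back for the model's output
    """
    budget = max_ctx - gen_reserve
    if len(token_ids) <= budget:
        return token_ids
    # Drop the oldest tokens; recent context is usually most relevant.
    return token_ids[-budget:]
```

For example, a 5000-token prompt would be cut down to the last 3584 tokens (4096 minus the 512-token reserve), while shorter prompts pass through unchanged.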
