arcee-ai/WitchLM-1.5B
Task: Text generation
Concurrency cost: 1
Model size: 1.5B
Quantization: BF16
Context length: 32k
Published: Sep 2, 2024
License: apache-2.0
Architecture: Transformer

WitchLM-1.5B by arcee-ai is a 1.5-billion-parameter language model with a 131,072-token context length. The model is trained with Axolotl and targets general language understanding, as reflected in its benchmark performance across a range of tasks. It suits applications that need a compact model with a very large context window.
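Since the model's weights are published under apache-2.0, it can be loaded locally. The sketch below shows one plausible way to run it with the Hugging Face `transformers` library; the generation parameters (`max_new_tokens`, `temperature`) and the chat-style prompt are illustrative assumptions, not settings documented for this model.

```python
# Minimal sketch: loading WitchLM-1.5B with transformers.
# Assumes the transformers and torch packages are installed;
# sampling settings below are illustrative, not official.
MODEL_ID = "arcee-ai/WitchLM-1.5B"


def generate(prompt: str, max_new_tokens: int = 128) -> str:
    """Generate a completion for `prompt` using WitchLM-1.5B."""
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype=torch.bfloat16,  # matches the listed BF16 quantization
        device_map="auto",
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output_ids = model.generate(
        **inputs,
        max_new_tokens=max_new_tokens,
        do_sample=True,
        temperature=0.7,  # assumed sampling setting
    )
    # Strip the prompt tokens so only the completion is returned.
    new_tokens = output_ids[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)


if __name__ == "__main__":
    print(generate("Explain what a context window is in one sentence."))
```

The long context window means prompts of tens of thousands of tokens are possible, though memory use grows with input length.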
