dpml/in-house-alpaca
TEXT GENERATION

Concurrency Cost: 1
Model Size: 7B
Quant: FP8
Ctx Length: 4k
License: apache-2.0
Architecture: Transformer
Open Weights

dpml/in-house-alpaca is a 7-billion-parameter language model developed by dpml and based on the Alpaca architecture. It is instruction-tuned for general-purpose conversational AI and for following user prompts, and its 4096-token context window makes it suitable for a variety of natural language understanding and generation tasks.
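As a rough illustration of how a client might call an instruction-tuned chat model like this one, the sketch below builds a chat-completion request payload. It assumes an OpenAI-compatible serving endpoint, which the model card does not confirm; the `build_request` helper and `max_new_tokens` parameter are hypothetical, and only the model id and the 4096-token context window come from the card.

```python
import json

# 4k-token context window stated on the model card.
CONTEXT_LENGTH = 4096

def build_request(user_prompt: str, max_new_tokens: int = 512) -> dict:
    """Assemble a chat-completion payload (hypothetical helper), reserving
    room for the model's reply inside the 4096-token context window.
    Assumes an OpenAI-compatible endpoint, which is not confirmed here."""
    if max_new_tokens >= CONTEXT_LENGTH:
        raise ValueError("max_new_tokens must leave room for the prompt")
    return {
        "model": "dpml/in-house-alpaca",
        "messages": [{"role": "user", "content": user_prompt}],
        "max_tokens": max_new_tokens,
    }

payload = build_request("Summarize the Apache-2.0 license in one sentence.")
print(json.dumps(payload, indent=2))
```

With a 4k context, the prompt tokens plus `max_tokens` must stay under 4096, so a caller would typically cap the reply budget well below the window.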
