vihangd/dopeyplats-1.1b-2T-v1
Task: Text Generation
Concurrency Cost: 1
Model Size: 1.1B
Quant: BF16
Ctx Length: 2k
Published: Nov 26, 2023
License: apache-2.0
Architecture: Transformer
Open Weights
vihangd/dopeyplats-1.1b-2T-v1 is an experimental 1.1 billion parameter language model, fine-tuned from the TinyLlama 1.1B 2T-token checkpoint. It was fine-tuned with QLoRA on Alpaca-style instruction data and further aligned with DPO (Direct Preference Optimization). The model is intended for instruction-following tasks and expects an Alpaca-style prompt template. Its compact size and targeted fine-tuning make it suitable for resource-constrained environments.
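Because the model expects an Alpaca-style prompt, a minimal sketch of loading and querying it with Hugging Face Transformers might look like the following. The exact prompt wording and generation settings are assumptions for illustration, not taken from the model card:

```python
# Minimal sketch: load the model in BF16 (as listed on the card) and query it
# with an Alpaca-style instruction prompt. The prompt wording and generation
# parameters below are assumptions; adjust them to the model's actual template.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "vihangd/dopeyplats-1.1b-2T-v1"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # BF16 weights per the listing
    device_map="auto",
)

# Alpaca-style instruction prompt (assumed format).
prompt = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\nExplain what a 2k token context length means.\n\n"
    "### Response:\n"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128, do_sample=False)

# Print only the newly generated tokens, not the echoed prompt.
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```

Keeping the prompt within the 2k-token context window (instruction plus generated response) is the main practical constraint when using a model of this size.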