TinyDolphin-2.8-1.1b is an experimental 1.1-billion-parameter causal language model based on the Llama 2 architecture, developed by Kearm. It was trained on Eric Hartford's new Dolphin 2.8 dataset with a 2048-token context length. The model is designed for applications with tight compute and memory budgets, while still offering creative text generation and nuanced responses.
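A minimal usage sketch with the Hugging Face transformers library, assuming the model is published on the Hugging Face Hub; the repository id below is an assumption, not confirmed by this page, so substitute the actual hub path:

```python
# Sketch: load TinyDolphin-2.8-1.1b as a causal LM and sample a completion.
# The repo id is an assumption -- replace with the real Hub path.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "cognitivecomputations/TinyDolphin-2.8-1.1b"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.float16)

prompt = "Write a short poem about the ocean."
inputs = tokenizer(prompt, return_tensors="pt")

# The model was trained with a 2048-token context, so keep the prompt
# length plus max_new_tokens within that window.
outputs = model.generate(**inputs, max_new_tokens=128, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

At 1.1B parameters in float16, the weights occupy roughly 2.2 GB, which is what makes the model practical on consumer GPUs and CPU-only machines.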