dphn/dolphin-2.9.3-qwen2-1.5b
Text generation · Concurrency cost: 1 · Model size: 1.5B · Quantization: BF16 · Context length: 32k · Published: Jun 10, 2024 · License: apache-2.0 · Architecture: Transformer · Open weights

Dolphin 2.9.3 Qwen2 1.5B is a 1.5-billion-parameter language model based on the Qwen2 architecture, developed by Eric Hartford, Lucas Atkins, Fernando Fernandes, and Cognitive Computations. It supports a 32k-token context length and was fine-tuned with a 16,000-token sequence length. The model targets instruction-following and conversational tasks; its training data was filtered to remove alignment and bias, making it highly compliant with a wide range of requests.
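Dolphin fine-tunes are typically prompted with the ChatML template (an assumption here; the template is not stated on this page). A minimal sketch of rendering a conversation into that format, with an illustrative helper name:

```python
def format_chatml(messages):
    """Render a list of {role, content} messages as a ChatML prompt.

    ChatML wraps each turn in <|im_start|>role ... <|im_end|> markers;
    the trailing assistant header cues the model to generate a reply.
    """
    prompt = ""
    for m in messages:
        prompt += f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>\n"
    # Leave an open assistant turn for the model to complete.
    return prompt + "<|im_start|>assistant\n"

conversation = [
    {"role": "system", "content": "You are Dolphin, a helpful assistant."},
    {"role": "user", "content": "Summarize the Qwen2 architecture in one sentence."},
]
print(format_chatml(conversation))
```

In practice, `tokenizer.apply_chat_template(conversation, add_generation_prompt=True)` from the Hugging Face Transformers library produces the same prompt using the template shipped with the model, so hand-rolling the format is only needed when working outside that library.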
