dphn/dolphin-2.9.3-qwen2-0.5b
Text Generation
Concurrency Cost: 1 | Model Size: 0.5B | Quant: BF16 | Context Length: 32k | Published: Jun 10, 2024 | License: apache-2.0 | Architecture: Transformer

Dolphin 2.9.3 Qwen2 0.5B is a 0.5-billion-parameter language model developed by Eric Hartford, Lucas Atkins, Fernando Fernandes, and Cognitive Computations, based on Qwen2-0.5B. The base model supports a 32k context length, and the fine-tuning was performed with a 16k sequence length. It is designed for instruction following and conversational tasks. The model is uncensored and highly compliant, making it suitable for use cases where operators prefer to add their own alignment layer.
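Like other Dolphin releases, this model is trained on the ChatML conversation format, so prompts for instruction following are assembled from `<|im_start|>`/`<|im_end|>` turns. A minimal sketch of building such a prompt (the system and user strings here are illustrative, not from the model card):

```python
def build_chatml_prompt(system: str, user: str) -> str:
    """Assemble a ChatML-format prompt: a system turn, a user turn,
    and an open assistant turn for the model to complete."""
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        "<|im_start|>assistant\n"
    )

prompt = build_chatml_prompt(
    "You are Dolphin, a helpful assistant.",
    "Summarize this paragraph in one sentence.",
)
print(prompt)
```

An inference client would send this string as the raw prompt (or, equivalently, pass the turns to a tokenizer's chat template) and stop generation at the next `<|im_end|>` token.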
