dphn/dolphin-2.8-experiment26-7b
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Context Length: 4K · Published: Mar 4, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights

dphn/dolphin-2.8-experiment26-7b is a 7-billion-parameter language model based on Yam Peleg's Experiment-26-7B, with a 4,096-token context length. This Dolphin variant is fine-tuned on extensive coding data, making it well suited to programming tasks. It is aimed at developers who need a capable model for code generation and related applications.