dphn/dolphin-2.8-experiment26-7b-preview
Type: Text generation
Concurrency cost: 1
Model size: 7B
Quantization: FP8
Context length: 4k
Published: Mar 2, 2024
License: apache-2.0
Architecture: Transformer
Open weights
dphn/dolphin-2.8-experiment26-7b-preview is a 7-billion-parameter language model from dphn, released as a 1-epoch checkpoint of the dolphin-2.8-experiment26-7b series. It has been evaluated on the Open LLM Leaderboard, where it shows balanced performance across reasoning and language-understanding benchmarks. It is suited to general-purpose language tasks where a compact, pre-release model with published benchmark scores is useful.
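For general-purpose use, a prompt for this model can be assembled as a chat-style string. The sketch below assumes the ChatML template used by earlier Dolphin releases; the system message and token markers shown are illustrative assumptions, not confirmed details of this specific checkpoint.

```python
def chatml_prompt(system: str, user: str) -> str:
    """Build a ChatML-style prompt string.

    Assumption: this preview checkpoint follows the ChatML convention
    of prior Dolphin models (<|im_start|>/<|im_end|> role markers).
    """
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        "<|im_start|>assistant\n"
    )

# Hypothetical usage: the system message text is an example, not a default.
prompt = chatml_prompt(
    "You are Dolphin, a helpful assistant.",
    "Summarize FP8 quantization in one sentence.",
)
print(prompt)
```

The resulting string would be passed to the model's text-generation endpoint; keep the combined prompt and completion within the 4k-token context window noted above.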