FelixChao/NarutoDolphin-10B
Text generation · Model size: 10.7B · Quant: FP8 · Context length: 4K · Published: Jan 14, 2024 · License: apache-2.0 · Architecture: Transformer

NarutoDolphin-10B by FelixChao is a 10.7-billion-parameter language model created by merging FelixChao/WizardDolphin-7B and FelixChao/NinjaDolphin-7B. It targets general text-generation tasks, combining the strengths of its two constituent models, and offers a 4096-token context window, which suits applications with moderate input and output lengths. Quantized versions (FP8) are also available for optimized deployment.
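Dolphin-derived models are typically trained on the ChatML prompt format, so the same format is a reasonable starting point for this merge. The sketch below shows how such a prompt could be assembled; the ChatML layout is an assumption inherited from the parent Dolphin models, not something stated on this card, so verify it against the model's tokenizer config before relying on it.

```python
MODEL_ID = "FelixChao/NarutoDolphin-10B"  # Hugging Face repo id from this card


def build_chatml_prompt(system: str, user: str) -> str:
    """Format a single-turn prompt in ChatML (assumed format for this merge)."""
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        "<|im_start|>assistant\n"
    )


prompt = build_chatml_prompt("You are a helpful assistant.", "Hello!")
```

The resulting string can be tokenized and passed to any inference backend (e.g. `transformers` `AutoModelForCausalLM.generate`) that serves this checkpoint; generation should stop at the `<|im_end|>` token.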
