dphn/fc-dolphin-2.6-mistral-7b-dpo-laser
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Context Length: 4k · Published: Feb 9, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights

dphn/fc-dolphin-2.6-mistral-7b-dpo-laser is a 7-billion-parameter Mistral-based language model developed by David, Fernando, and Eric, sponsored by VAGO Solutions and HyperSpace.Ai. It is fine-tuned specifically for function calling, using a training technique that partially freezes the model's weights to prevent catastrophic forgetting of its base capabilities. The model excels at integrating external tools and APIs into its responses, making it well suited to applications that require structured, machine-readable interactions.
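To make the function-calling workflow concrete, here is a minimal sketch of the application side: parsing a JSON tool call emitted by the model and dispatching it to a local function. The JSON shape (`name` plus `arguments`) and the `get_weather` tool are illustrative assumptions, not the model's documented output format, which depends on the prompt template used during fine-tuning.

```python
import json

def get_weather(city: str) -> str:
    """Stand-in local tool the model is allowed to call (hypothetical)."""
    return f"Sunny in {city}"

# Registry mapping tool names the model may emit to local callables.
TOOLS = {"get_weather": get_weather}

def dispatch(model_output: str) -> str:
    """Parse a JSON function call from the model and invoke the matching tool.

    Assumes the model emits an object like:
        {"name": "<tool>", "arguments": {...}}
    """
    call = json.loads(model_output)
    fn = TOOLS[call["name"]]
    return fn(**call["arguments"])

# Example: a (hypothetical) raw completion from the model.
output = '{"name": "get_weather", "arguments": {"city": "Lisbon"}}'
print(dispatch(output))  # → Sunny in Lisbon
```

In practice the raw completion would come from an inference call to the model; the dispatcher's result is then fed back into the conversation so the model can compose a final natural-language answer.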
