VAGOsolutions/FC-SauerkrautLM-7b-beta
Text generation · Concurrency cost: 1 · Model size: 7B · Quant: FP8 · Context length: 4k · Published: Feb 5, 2024 · License: apache-2.0 · Architecture: Transformer · Open weights

VAGOsolutions/FC-SauerkrautLM-7b-beta is a 7-billion-parameter function-calling language model developed jointly by VAGO solutions and Hyperspace.ai, based on openchat/openchat-3.5-0106. It was fine-tuned with SFT and aligned with DPO, incorporating the novel LaserRMT training technique, which partially freezes the model during training for optimized performance. The model is designed specifically for function-calling tasks and supports both German and English.
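As a minimal sketch of how a function-calling model like this is typically invoked, the snippet below assembles an OpenAI-style chat-completion payload with a tool definition. The endpoint conventions, the `get_weather` tool, and the message schema are assumptions for illustration, not part of the model card; consult the model's own documentation for its exact prompt format.

```python
import json

def build_function_call_request(user_message: str) -> dict:
    """Assemble a hypothetical chat-completion payload with one example tool.

    The tool name, schema, and request layout are illustrative assumptions
    following the common OpenAI-style tools convention.
    """
    return {
        "model": "VAGOsolutions/FC-SauerkrautLM-7b-beta",
        "messages": [{"role": "user", "content": user_message}],
        "tools": [
            {
                "type": "function",
                "function": {
                    "name": "get_weather",  # hypothetical tool
                    "description": "Get the current weather for a city.",
                    "parameters": {
                        "type": "object",
                        "properties": {"city": {"type": "string"}},
                        "required": ["city"],
                    },
                },
            }
        ],
    }

# German input works too, since the model supports German and English.
payload = build_function_call_request("Wie ist das Wetter in Berlin?")
print(json.dumps(payload, indent=2))
```

A server hosting the model would receive this payload and, for a matching request, respond with a structured call to `get_weather` rather than free-form text.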
