LSX-UniWue/LLaMmlein_7B
Text generation · Concurrency cost: 1 · Model size: 7B · Quantization: FP8 · Context length: 4k · Published: Apr 7, 2025 · License: other · Architecture: Transformer

LLaMmlein 7B is a 7-billion-parameter German LLaMA model developed by LSX-UniWue and trained from scratch using an adapted TinyLlama codebase. It was trained on the German portion of RedPajama V2, with additional deduplication and filtering to improve data quality. The model is specialized for German language processing, offering a focused option for applications that require strong German linguistic capabilities.
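As a minimal sketch of how the model might be used for German text generation, the snippet below loads it with the Hugging Face `transformers` library and trims prompts to fit the 4k context window noted above. The Hub id `LSX-UniWue/LLaMmlein_7B`, the `fit_to_context` helper, and the default `max_new_tokens` value are assumptions for illustration, not confirmed details of this deployment.

```python
"""Sketch: German text generation with LLaMmlein 7B (assumptions noted inline)."""

MODEL_ID = "LSX-UniWue/LLaMmlein_7B"  # assumed Hugging Face Hub id from the card title
CTX_LENGTH = 4096                     # 4k context length, per the card metadata


def fit_to_context(token_ids, max_new_tokens, ctx_length=CTX_LENGTH):
    """Trim a prompt so prompt tokens + generated tokens fit the context window.

    Keeps the most recent tokens, dropping the oldest ones first.
    """
    budget = ctx_length - max_new_tokens
    if budget <= 0:
        raise ValueError("max_new_tokens exceeds the context window")
    return token_ids[-budget:]


def generate_german(prompt, max_new_tokens=128):
    """Generate a German continuation for `prompt` (downloads the model)."""
    # Heavy imports kept local so the pure helper above has no dependencies.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID)

    ids = fit_to_context(tokenizer.encode(prompt), max_new_tokens)
    output = model.generate(
        torch.tensor([ids]), max_new_tokens=max_new_tokens
    )
    return tokenizer.decode(output[0], skip_special_tokens=True)
```

Trimming from the left (keeping the most recent tokens) is a common choice for causal models, since the tokens nearest the generation point carry the most weight; other truncation strategies are equally valid.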
