LSX-UniWue/LLaMmlein_1B_prerelease
Text generation · Model size: 1.1B · Quant: BF16 · Context length: 2k · Published: Oct 15, 2024 · License: other · Architecture: Transformer

LLaMmlein 1B is a German 1-billion-parameter language model based on the TinyLlama architecture, developed by LSX-UniWue. It was trained from scratch on the German portion of the RedPajama V2 dataset. The model is designed specifically for German-language tasks, offering a compact, efficient option for applications that need a small German LLM. Its primary strength is German language proficiency within a small parameter footprint.