LSX-UniWue/LLaMmlein_1B
Text generation
- Model size: 1.1B parameters
- Quantization: BF16
- Context length: 2k tokens
- Concurrency cost: 1
- Published: Jul 4, 2025
- License: other
- Architecture: Transformer

LLaMmlein 1B is a 1.1-billion-parameter German LLaMA-style model developed by LSX-UniWue, trained from scratch on a deduplicated and filtered German portion of the RedPajama V2 dataset. It is designed specifically for German-language tasks, offering a specialized alternative to general-purpose LLMs. Its primary use case is research and development in German natural language processing, where its German-only training data can give it an edge over multilingual models of similar size.
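As a minimal sketch of how such a model is typically used, the snippet below loads the checkpoint with the Hugging Face `transformers` library and generates a German continuation. The repo id `LSX-UniWue/LLaMmlein_1B` comes from this card; the BF16 dtype matches the quantization listed above, but the exact loading options are an assumption, not an official usage recipe from the model authors.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Repo id taken from this model card.
MODEL_ID = "LSX-UniWue/LLaMmlein_1B"

def generate_german(prompt: str, max_new_tokens: int = 64) -> str:
    """Generate a German continuation of `prompt` (downloads ~2 GB on first call)."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    # bfloat16 matches the BF16 quantization stated in the card metadata.
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype=torch.bfloat16)
    inputs = tokenizer(prompt, return_tensors="pt")
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)

# Example (hypothetical prompt):
# generate_german("Die Universität Würzburg ist")
```

Note that the 2k context length above bounds prompt plus generated tokens; longer inputs must be truncated before calling `generate`.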
