l3lab/L1-1.5B-Short
Task: Text Generation
Concurrency Cost: 1
Model Size: 1.5B
Quant: BF16
Ctx Length: 32k
Published: Jul 12, 2025
License: MIT
Architecture: Transformer
Open Weights

l3lab/L1-1.5B-Short is a 1.5-billion-parameter language model developed by l3lab, with a 32,768-token context window. It targets applications that need efficient processing of long sequences, such as summarization, long-form content generation, and complex question answering where extensive context matters, while keeping the parameter count relatively compact.
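A practical concern with any long-context model is keeping the prompt plus the generation budget inside the 32,768-token window. Below is a minimal, hedged sketch of that bookkeeping; the list-of-strings "tokens" is a stand-in so the example runs without downloading the model's actual tokenizer, and the `max_new_tokens` value is an illustrative assumption, not a documented default.

```python
# Budget a prompt against the model's 32,768-token context window.
# In practice, token counts would come from the model's own tokenizer;
# here, plain list elements stand in for tokens so the sketch is self-contained.
CTX_LEN = 32_768  # context length stated on the model card

def trim_to_context(tokens: list[str], max_new_tokens: int = 512) -> list[str]:
    """Keep only as many trailing tokens as leave room for generation."""
    budget = CTX_LEN - max_new_tokens
    return tokens[-budget:] if len(tokens) > budget else tokens

# An over-long input is truncated from the front, keeping the most recent text.
long_input = ["tok"] * 40_000
print(len(trim_to_context(long_input)))  # 32256
```

Truncating from the front (keeping the tail) is one common heuristic for chat-style prompts; summarization pipelines often chunk the document instead.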
