four-two-labs/lynx-micro
Text generation · Concurrency cost: 1 · Model size: 2.6B · Quant: BF16 · Context length: 8k · Architecture: Transformer

Lynx-micro is a 2.6 billion parameter autoregressive transformer model developed by 42 Labs, fine-tuned from Google DeepMind's Gemma 2B. The model is optimized for Swedish and English language tasks and performs strongly on the ScandEval Swedish NLG benchmark, scoring just below GPT-3.5 Turbo. It is particularly capable for its parameter count, making it well suited to applications that need efficient, high-quality language processing in Swedish.
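As a sketch of how such a model might be used, the snippet below loads it with the Hugging Face `transformers` library, assuming the weights are published under the `four-two-labs/lynx-micro` repository id shown above; the prompt and generation parameters are illustrative, not part of the model card.

```python
MODEL_ID = "four-two-labs/lynx-micro"  # repo id from this model card


def generate(prompt: str, max_new_tokens: int = 128) -> str:
    """Generate a completion with Lynx-micro (sketch, assumes weights are
    available on the Hugging Face Hub under MODEL_ID)."""
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    # BF16 matches the quantization listed in the card's spec line.
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype=torch.bfloat16
    )

    inputs = tokenizer(prompt, return_tensors="pt")
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)


if __name__ == "__main__":
    # Example Swedish prompt (illustrative).
    print(generate("Sammanfatta kort vad en transformermodell är:"))
```

Note that the 8k context length bounds the combined prompt and generated tokens, so long inputs should be truncated before calling `generate`.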
