azale-ai/Starstreak-7b-beta
TEXT GENERATION
Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Nov 19, 2023 · License: cc0-1.0 · Architecture: Transformer

Starstreak-7b-beta by azale-ai is a 7-billion-parameter language model, fine-tuned from Zephyr-7b-beta using QLoRA. It specializes in generating content in English, Indonesian, and several traditional Indonesian languages, including Acehnese, Balinese, and Javanese. Fine-tuned on the Wikipedia and CulturaX datasets, the model targets multilingual applications centered on Indonesian linguistic diversity.
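A minimal sketch of prompting the model. The chat format below is an assumption carried over from the base model, Zephyr-7b-beta, which uses `<|system|>`/`<|user|>`/`<|assistant|>` role markers terminated by `</s>`; the helper function name is illustrative, not part of the model's published API.

```python
def build_zephyr_prompt(system: str, user: str) -> str:
    """Assemble a single-turn prompt in the Zephyr chat format
    (assumed to apply to Starstreak-7b-beta via its base model)."""
    return (
        f"<|system|>\n{system}</s>\n"
        f"<|user|>\n{user}</s>\n"
        f"<|assistant|>\n"
    )

prompt = build_zephyr_prompt(
    "You are a helpful assistant fluent in Indonesian and Javanese.",
    "Tuliskan satu kalimat tentang Borobudur.",  # "Write one sentence about Borobudur."
)
print(prompt)

# With Hugging Face transformers (not run here; downloads the 7B weights):
# from transformers import pipeline
# pipe = pipeline("text-generation", model="azale-ai/Starstreak-7b-beta")
# print(pipe(prompt, max_new_tokens=128)[0]["generated_text"])
```

In practice, prefer the tokenizer's own `apply_chat_template` when available, since it encodes the exact format the model was fine-tuned with.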
