bogdan1/llama2-bg
Task: Text Generation
Model Size: 7B
Quantization: FP8
Context Length: 4k
License: MIT
Architecture: Transformer
Concurrency Cost: 1

bogdan1/llama2-bg is a 7-billion-parameter Llama-2 base model fine-tuned on Bulgarian text, specifically the Chitanka dataset and scraped news comments. The model is optimized for generating Bulgarian text and shows improved coherence and context retention in Bulgarian compared to vanilla Llama-2-7b, though it may exhibit toxicity and grammatical imperfections inherited from its training data. It is primarily intended for Bulgarian text-generation tasks where a model fine-tuned specifically for the language is preferred.
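
For reference, a minimal sketch of loading and sampling from the model with Hugging Face transformers, assuming the weights are published on the Hub under this id in the standard Llama-2 layout (the dtype and sampling parameters below are illustrative, not prescribed by the model card):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "bogdan1/llama2-bg"

# Assumes a standard Llama-2 checkpoint layout on the Hugging Face Hub.
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # half precision keeps a 7B model on a single ~16 GB GPU
    device_map="auto",
)

# As a base (non-chat) model, it continues raw text rather than following instructions.
prompt = "Имало едно време "  # "Once upon a time " in Bulgarian
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=100, do_sample=True, top_p=0.9)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Because this is a base model rather than an instruction-tuned one, prompts should be framed as text to be continued, not as questions or commands.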
