NilanE/karasu-translation
Text generation
- Model size: 1.1B
- Quantization: BF16
- Context length: 2k
- Concurrency cost: 1
- License: apache-2.0
- Architecture: Transformer (open weights)
NilanE/karasu-translation is a Llama-based model developed by NilanE and fine-tuned from karasu-web. It was trained with Unsloth and Hugging Face's TRL library, which enabled roughly 2x faster training. The model is licensed under Apache-2.0.
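Since this is an open-weights Llama-based model, it can presumably be loaded with the standard `transformers` API. The sketch below is a minimal, hedged example: the model ID comes from this card, but the prompt format and generation settings are assumptions, not documented behavior of this checkpoint.

```python
# Minimal loading sketch for a Llama-based checkpoint via transformers.
# The prompt format below is an assumption; check the model card for
# the actual template the fine-tune expects.
from transformers import AutoModelForCausalLM, AutoTokenizer


def load_model(model_id: str = "NilanE/karasu-translation"):
    """Load tokenizer and model in BF16, as listed in the card metadata."""
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="bfloat16")
    return tokenizer, model


if __name__ == "__main__":
    tokenizer, model = load_model()
    # 2k context per the card metadata; keep inputs within that window.
    inputs = tokenizer("Translate to English:", return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=64)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Downloading the 1.1B BF16 weights requires roughly 2.2 GB of disk; quantized variants, if published, would reduce that.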