NilanE/karasu-web-2
Text Generation | Open Weights | Warm
Concurrency Cost: 1
Model Size: 1.1B
Quant: BF16
Ctx Length: 2k
License: apache-2.0
Architecture: Transformer
NilanE/karasu-web-2 is a Llama-based model developed by NilanE, fine-tuned from lightblue/karasu-1.1B. It was trained with Unsloth and Hugging Face's TRL library, which sped up fine-tuning considerably. The model targets general language tasks, with its Llama architecture giving it broad applicability.