Aspik101/trurl-2-7b-pl-instruct_unload
Text generation · Concurrency cost: 1 · Model size: 7B · Quant: FP8 · Context length: 4k · Published: Aug 17, 2023 · License: other · Architecture: Transformer
Aspik101/trurl-2-7b-pl-instruct_unload is a 7-billion-parameter, Llama-2-based, instruction-tuned causal language model developed by Aspik101. It is fine-tuned specifically for Polish-language tasks on the Lajonbot/alpaca-dolly-chrisociepa-instruction-only-polish dataset. With a 4096-token context length, it is optimized for text generation and understanding in Polish.
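Since the fine-tuning dataset is Alpaca-style, prompts for this model are typically wrapped in an instruction template before generation. The model card does not document an exact prompt format, so the template below is an assumption based on the standard Alpaca layout; the example instruction is likewise hypothetical:

```python
def format_prompt(instruction: str) -> str:
    """Wrap a user instruction in an Alpaca-style template (assumed format,
    not confirmed by the model card)."""
    return (
        "Below is an instruction that describes a task. "
        "Write a response that appropriately completes the request.\n\n"
        f"### Instruction:\n{instruction}\n\n"
        "### Response:\n"
    )

# Example Polish instruction ("Briefly describe the history of Warsaw.")
prompt = format_prompt("Opisz krótko historię Warszawy.")
print(prompt)
```

The resulting string would then be passed to the model for completion, for example via a Hugging Face `transformers` text-generation pipeline pointed at this model; the model continues the text after the `### Response:` marker.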