wpsytz123/signaldesk-qualifier-8b-r4
Text Generation · Concurrency Cost: 1 · Model Size: 8B · Quant: FP8 · Ctx Length: 32k · Published: Mar 21, 2026 · Architecture: Transformer

wpsytz123/signaldesk-qualifier-8b-r4 is an 8-billion-parameter language model fine-tuned from unsloth/llama-3.1-8b-instruct-unsloth-bnb-4bit using the TRL framework. It is adapted for instruction-following tasks, building on its Llama 3.1 base for conversational use, and targets applications that need a compact yet capable instruction-tuned model with a 32,768-token (32k) context length.
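Because the model follows its Llama 3.1 base, prompts are expected in the standard Llama 3.1 chat format. The sketch below builds such a prompt by hand, assuming the model uses the stock Llama 3.1 special tokens; in practice you would normally let `tokenizer.apply_chat_template` from the `transformers` library do this for you.

```python
def build_llama31_prompt(messages):
    """Render chat messages into the Llama 3.1 prompt format.

    `messages` is a list of {"role": ..., "content": ...} dicts.
    This hand-rolled sketch mirrors what apply_chat_template
    produces for Llama 3.1-style tokenizers.
    """
    parts = ["<|begin_of_text|>"]
    for m in messages:
        parts.append(
            f"<|start_header_id|>{m['role']}<|end_header_id|>\n\n"
            f"{m['content']}<|eot_id|>"
        )
    # Open an assistant header so the model generates the reply.
    parts.append("<|start_header_id|>assistant<|end_header_id|>\n\n")
    return "".join(parts)


prompt = build_llama31_prompt([
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Qualify this lead: ACME Corp, 500 seats."},
])
```

The resulting string can be passed to any completion endpoint serving the model; for chat endpoints, send the message list directly and let the server apply the template.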
