NiGuLa/psydetect_llama_32_3b_instruct_1em4_merged
Text generation · Concurrency cost: 1 · Model size: 3.2B · Quantization: BF16 · Context length: 32k · Published: Apr 6, 2026 · Architecture: Transformer

The NiGuLa/psydetect_llama_32_3b_instruct_1em4_merged model is a 3.2-billion-parameter instruction-tuned language model with a 32,768-token context length. Based on the Llama architecture, it is designed for general-purpose conversational AI, and its instruction-following capabilities make it suitable for a variety of natural language processing tasks.
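Since the model is Llama-based and instruction-tuned, prompts are presumably expected in the Llama 3.x instruct chat format. The sketch below builds such a prompt by hand purely for illustration; the special-token layout is an assumption based on the Llama 3 family, and in practice `tokenizer.apply_chat_template` from Hugging Face `transformers` would produce it for you.

```python
# Hypothetical sketch of the Llama 3.x instruct prompt layout (assumed for
# this Llama-based model); normally generated by tokenizer.apply_chat_template.

def format_llama3_prompt(system: str, user: str) -> str:
    """Build a single-turn prompt in the (assumed) Llama 3 instruct format."""
    return (
        "<|begin_of_text|>"
        "<|start_header_id|>system<|end_header_id|>\n\n"
        f"{system}<|eot_id|>"
        "<|start_header_id|>user<|end_header_id|>\n\n"
        f"{user}<|eot_id|>"
        # Generation continues from the open assistant header.
        "<|start_header_id|>assistant<|end_header_id|>\n\n"
    )

prompt = format_llama3_prompt(
    "You are a helpful assistant.",
    "Explain BF16 quantization in one sentence.",
)
```

The resulting string can be tokenized and passed to the model's `generate` method, with generation stopping at the `<|eot_id|>` token.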