NiGuLa/psydetect_llama_32_3b_instruct_1em4_merged

Text Generation · Concurrency Cost: 1 · Model Size: 3.2B · Quant: BF16 · Ctx Length: 32k · Published: Apr 6, 2026 · Architecture: Transformer · Cold

The NiGuLa/psydetect_llama_32_3b_instruct_1em4_merged model is a 3.2-billion-parameter instruction-tuned language model with a 32,768-token context length. It is based on the Llama architecture and designed for general-purpose conversational AI; its instruction-following capabilities make it suitable for a variety of natural language processing tasks.


Model Overview

This model, NiGuLa/psydetect_llama_32_3b_instruct_1em4_merged, is an instruction-tuned language model built upon the Llama architecture. With 3.2 billion parameters and a context window of 32,768 tokens, it is designed to handle complex and lengthy conversational inputs.

Key Characteristics

  • Architecture: Llama-based, a well-established foundation for language understanding and generation.
  • Parameter Count: 3.2 billion parameters, balancing capability against computational cost.
  • Context Length: 32,768 tokens, enabling the model to maintain context over extended dialogues or long documents.
  • Instruction-Tuned: Optimized to follow natural-language instructions, making it versatile across NLP applications.
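As a sketch of how these characteristics come together in practice, the snippet below loads the model with the Hugging Face `transformers` library in BF16, the precision listed in the card metadata. This is an assumed standard loading path, not a method documented by the model's author; only the model ID and context length come from this card.

```python
MODEL_ID = "NiGuLa/psydetect_llama_32_3b_instruct_1em4_merged"
CONTEXT_LENGTH = 32768  # context window listed on this card


def load_model(model_id: str = MODEL_ID):
    """Load the tokenizer and model in BF16, as listed on this card.

    A minimal sketch assuming the standard transformers API; imports are
    kept local so the module can be inspected without the heavy
    dependencies installed.
    """
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        torch_dtype=torch.bfloat16,  # BF16 quantization per the metadata
        device_map="auto",           # place layers on available devices
    )
    return tokenizer, model
```

`device_map="auto"` lets `accelerate` spread the 3.2B parameters across whatever GPUs (or CPU) are available, which is usually sufficient for a model of this size on a single modern GPU.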

Potential Use Cases

Given its instruction-following capabilities and large context window, this model is well-suited for:

  • General-purpose chatbots and conversational agents.
  • Text summarization and generation from long inputs.
  • Question answering over extensive documents.
  • Prototyping and development of AI applications requiring robust instruction adherence.
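For the long-input use cases above, it helps to budget the 32,768-token window before sending a prompt. The helpers below are a hypothetical sketch using a rough characters-per-token heuristic; real token counts require the model's tokenizer, and the 4-chars-per-token ratio is an assumption, not a property of this model.

```python
CONTEXT_LENGTH = 32768   # context window listed on this card
CHARS_PER_TOKEN = 4      # rough heuristic; use the tokenizer for real counts


def fits_in_context(prompt: str, reserve_for_output: int = 1024) -> bool:
    """Rough check that a prompt leaves room for generated tokens."""
    approx_tokens = len(prompt) // CHARS_PER_TOKEN
    return approx_tokens + reserve_for_output <= CONTEXT_LENGTH


def chunk_document(text: str, max_tokens: int = 8000) -> list[str]:
    """Split an over-long document into chunks of roughly max_tokens each,
    e.g. for summarize-then-combine or per-chunk question answering."""
    max_chars = max_tokens * CHARS_PER_TOKEN
    return [text[i:i + max_chars] for i in range(0, len(text), max_chars)]
```

A chunk size well under the full window leaves room for the instruction, any prior summaries, and the model's answer in a single prompt.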