narcolepticchicken/legal-agent-router-1.5B

Text Generation · Concurrency Cost: 1 · Model Size: 1.5B · Quant: BF16 · Ctx Length: 32k · Published: May 5, 2026 · Architecture: Transformer · Cold

narcolepticchicken/legal-agent-router-1.5B is a 1.5-billion-parameter causal language model fine-tuned from Qwen/Qwen2.5-1.5B-Instruct by narcolepticchicken using the TRL framework. It supports a context length of 32,768 tokens and is intended for general instruction-following and conversational text generation tasks.


Overview

narcolepticchicken/legal-agent-router-1.5B is a 1.5-billion-parameter instruction-tuned causal language model built on Qwen/Qwen2.5-1.5B-Instruct and further fine-tuned with the TRL (Transformer Reinforcement Learning) framework. Its 32,768-token context window lets it process long inputs and generate longer, more coherent responses.
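A minimal prompting sketch follows. It assumes the model keeps the ChatML chat template of its Qwen2.5-1.5B-Instruct base; the helper name, example prompt, and generation settings are illustrative, and the tokenizer's own `apply_chat_template` on the Hub is the authoritative source for the real template.

```python
# Sketch of prompting the model, assuming it inherits the ChatML chat
# template of its Qwen2.5-1.5B-Instruct base (an assumption -- check the
# tokenizer's chat template on the Hub to confirm).

def build_chatml_prompt(messages):
    """Render a list of {role, content} messages as a ChatML prompt
    ending with an open assistant turn for the model to complete."""
    parts = [f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>\n" for m in messages]
    parts.append("<|im_start|>assistant\n")
    return "".join(parts)

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Summarize the key terms of an NDA."},
]
prompt = build_chatml_prompt(messages)

# With transformers installed, the prompt (or the tokenizer's own
# apply_chat_template) feeds straight into generate():
#
#   from transformers import AutoModelForCausalLM, AutoTokenizer
#   tok = AutoTokenizer.from_pretrained("narcolepticchicken/legal-agent-router-1.5B")
#   model = AutoModelForCausalLM.from_pretrained(
#       "narcolepticchicken/legal-agent-router-1.5B", torch_dtype="bfloat16")
#   ids = tok(prompt, return_tensors="pt").input_ids
#   out = model.generate(ids, max_new_tokens=256)
```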

Key Capabilities

  • Instruction Following: Fine-tuned for general instruction-following, making it suitable for a variety of conversational and text generation tasks.
  • Extended Context: Benefits from a 32768-token context window, allowing for deeper understanding and generation based on extensive input.
  • TRL Fine-tuning: Trained with the TRL framework, which covers supervised fine-tuning as well as reinforcement-learning methods such as DPO and PPO for shaping model behavior.
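The 32,768-token window above still has to be budgeted between prompt and completion. A small sketch of one way to do that (the helper name and middle-dropping strategy are my own; real token counts come from the tokenizer):

```python
# Budget the 32k context window between prompt and generation.
# Strategy (illustrative): keep the head and tail of the prompt and drop
# the middle if it would not fit alongside the requested completion.

CTX_LEN = 32768  # model's maximum context length in tokens

def trim_prompt(prompt_ids, max_new_tokens, ctx_len=CTX_LEN):
    """Return prompt token ids trimmed so prompt + completion fit.

    Keeps the first and last halves of the budget, which preserves
    instructions at the start and recent turns at the end.
    """
    budget = ctx_len - max_new_tokens
    if budget <= 0:
        raise ValueError("max_new_tokens leaves no room for the prompt")
    if len(prompt_ids) <= budget:
        return list(prompt_ids)
    head = budget // 2
    tail = budget - head
    return list(prompt_ids[:head]) + list(prompt_ids[-tail:])
```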

Good For

  • General Conversational AI: Ideal for applications requiring a compact yet capable model for dialogue and instruction-based interactions.
  • Text Generation: Suitable for generating coherent and contextually relevant text based on user prompts.
  • Exploration with TRL Models: Provides a practical example for developers interested in models fine-tuned with the TRL library.
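For readers exploring TRL, here is a sketch of how a supervised fine-tune like this one is typically set up with TRL's `SFTTrainer`. The dataset schema, file names, and hyperparameters below are illustrative assumptions, not the author's actual training recipe.

```python
# Illustrative TRL-style supervised fine-tuning setup. The dataset
# schema and hyperparameters are assumptions for the sketch, not the
# recipe actually used to train this model.

def to_messages(example):
    """Convert an instruction/response record into the chat-message
    format that TRL's SFTTrainer accepts for conversational datasets."""
    return {
        "messages": [
            {"role": "user", "content": example["instruction"]},
            {"role": "assistant", "content": example["response"]},
        ]
    }

# With trl and datasets installed, training looks roughly like:
#
#   from datasets import load_dataset
#   from trl import SFTConfig, SFTTrainer
#
#   ds = load_dataset("json", data_files="train.jsonl")["train"].map(to_messages)
#   trainer = SFTTrainer(
#       model="Qwen/Qwen2.5-1.5B-Instruct",
#       train_dataset=ds,
#       args=SFTConfig(output_dir="legal-agent-router-1.5B"),
#   )
#   trainer.train()
```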