heavytail/kullm-solar

Text generation · Concurrency cost: 1 · Model size: 10.7B · Quantization: FP8 · Context length: 4k · License: apache-2.0 · Architecture: Transformer · Open weights · Warm

heavytail/kullm-solar is a 10.7 billion parameter causal language model developed by heavytail, fine-tuned from Upstage/SOLAR-10.7B-Instruct-v1.0. The model is specialized for instruction-following tasks, having been fine-tuned on the KULLM dataset combined with hand-crafted instruction data, and is designed for general-purpose conversational AI and instruction-based applications.


KULLM-Solar: An Instruction-Tuned Language Model

KULLM-Solar is a 10.7 billion parameter large language model developed by heavytail, built upon the robust Upstage/SOLAR-10.7B-Instruct-v1.0 base model. This model has undergone specialized fine-tuning to enhance its instruction-following capabilities.

Key Capabilities

  • Instruction Following: The model is specifically trained to understand and execute a wide range of instructions, making it suitable for interactive and task-oriented applications.
  • Custom Dataset Integration: Fine-tuned using a unique KULLM dataset combined with hand-crafted instruction data, which contributes to its specialized performance.
  • General-Purpose Applications: While instruction-tuned, its base architecture allows for broad applicability in various natural language processing tasks.

Good For

  • Conversational AI: Developing chatbots and virtual assistants that require precise instruction adherence.
  • Task Automation: Automating text-based tasks where clear instructions can be provided.
  • Research and Development: Exploring the impact of specialized instruction datasets on model performance and behavior.
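For the interactive and task-oriented uses above, prompts need to be wrapped in the chat format the model was trained on. The base model, Upstage/SOLAR-10.7B-Instruct-v1.0, uses a `### User:` / `### Assistant:` prompt template; assuming this fine-tune preserves that template (an assumption, not stated on this page), a prompt can be built like so:

```python
# Minimal sketch of the SOLAR-Instruct prompt format. Whether kullm-solar
# keeps the base model's exact template is an assumption here; check the
# tokenizer's chat template on the model repository to confirm.
def build_prompt(instruction: str) -> str:
    """Wrap a user instruction in the SOLAR-Instruct chat format."""
    return f"### User:\n{instruction}\n\n### Assistant:\n"

prompt = build_prompt("Explain instruction tuning in one sentence.")
print(prompt)
```

The resulting string would be tokenized and passed to the model; generation then continues from the trailing `### Assistant:` marker.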

Popular Sampler Settings

The top parameter combinations used by Featherless users for this model tune the following samplers: `temperature`, `top_p`, `top_k`, `frequency_penalty`, `presence_penalty`, `repetition_penalty`, and `min_p`.
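As a minimal sketch, these sampler parameters could be supplied in a request to an OpenAI-compatible chat-completions endpoint. The endpoint URL in the comment and every numeric value below are illustrative assumptions, not configurations taken from this page:

```python
import json

# Request payload with the sampler parameters listed above.
# All values are placeholders, not recommended settings.
payload = {
    "model": "heavytail/kullm-solar",
    "messages": [
        {"role": "user", "content": "List three uses of instruction-tuned models."}
    ],
    "temperature": 0.7,
    "top_p": 0.9,
    "top_k": 40,
    "frequency_penalty": 0.0,
    "presence_penalty": 0.0,
    "repetition_penalty": 1.1,
    "min_p": 0.05,
}

body = json.dumps(payload)
# The serialized body would then be POSTed with an Authorization header, e.g.:
# requests.post("https://api.featherless.ai/v1/chat/completions",
#               headers={"Authorization": f"Bearer {API_KEY}"}, data=body)
print(body)
```

Parameters a given backend does not recognize (e.g. `min_p` on some servers) are typically rejected or silently ignored, so it is worth checking which samplers the serving API actually honors.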