Sao10K/Fimbulvetr-11B-v2
TEXT GENERATION

  • Concurrency Cost: 1
  • Model Size: 10.7B
  • Quant: FP8
  • Ctx Length: 4k
  • Published: Feb 6, 2024
  • License: cc-by-nc-4.0
  • Architecture: Transformer
  • 0.2K · Open Weights · Warm

Sao10K/Fimbulvetr-11B-v2 is a 10.7 billion parameter language model developed by Sao10K, based on a "Solar-Based" architecture. This model is designed for general text generation tasks, supporting both Alpaca and Vicuna prompt formats. It is suitable for applications requiring a moderately sized, versatile language model.


Overview

Sao10K/Fimbulvetr-11B-v2 is a 10.7 billion parameter language model developed by Sao10K, described as a "Solar-Based Model." Version 2 has undergone extensive testing and received positive user feedback, and is considered ready for general use. The model supports common prompting formats, making it adaptable to a variety of applications.

Key Capabilities

  • Flexible Prompting: Compatible with both Alpaca and Vicuna instruction formats, allowing for broad integration into existing workflows.
  • General Text Generation: Designed for a wide range of text-based tasks, leveraging its 10.7 billion parameters for coherent and relevant outputs.

Recommended Usage

  • Prompt Formats: Users can choose between Alpaca or Vicuna formats based on their preference or existing system requirements.
    • Alpaca Format: ### Instruction:\n<Prompt>\n### Input:\n<Insert Context Here>\n### Response:
    • Vicuna Format: System: <Prompt>\n\nUser: <Input>\n\nAssistant:
  • SillyTavern Presets: "Universal Light" is recommended for optimal performance within the SillyTavern environment.
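The two prompt formats above can be assembled programmatically. A minimal sketch in Python; the helper names are my own for illustration, not part of the model card:

```python
def alpaca_prompt(instruction: str, context: str = "") -> str:
    """Build a prompt in the Alpaca format recommended for this model."""
    parts = [f"### Instruction:\n{instruction}"]
    if context:  # the Input section carries optional context
        parts.append(f"### Input:\n{context}")
    parts.append("### Response:\n")
    return "\n".join(parts)


def vicuna_prompt(system: str, user_input: str) -> str:
    """Build a prompt in the Vicuna format recommended for this model."""
    return f"System: {system}\n\nUser: {user_input}\n\nAssistant:"
```

Either string can then be sent as the raw prompt to whatever completion endpoint serves the model; pick one format and use it consistently within a conversation.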

Development Status

The developer, Sao10K, has indicated that work on a v3 is currently halted, with focus shifting to dataset refinement. This suggests a commitment to improving the underlying data quality for future iterations.

Popular Sampler Settings

The top 3 parameter combinations used by Featherless users for this model tune the following sampler parameters: temperature, top_p, top_k, frequency_penalty, presence_penalty, repetition_penalty, and min_p.
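These parameters map directly onto a generation config. A sketch with placeholder values; the numbers below are illustrative defaults I chose, not the actual presets reported by Featherless users:

```python
# Illustrative sampler configuration for text generation.
# Values are placeholders, not the Featherless community presets.
sampler_config = {
    "temperature": 0.8,         # randomness of token selection
    "top_p": 0.95,              # nucleus sampling: keep smallest set of tokens with cumulative prob >= 0.95
    "top_k": 40,                # restrict sampling to the 40 most likely tokens
    "frequency_penalty": 0.0,   # penalize tokens proportionally to how often they appeared
    "presence_penalty": 0.0,    # penalize tokens that have appeared at all
    "repetition_penalty": 1.1,  # multiplicative penalty discouraging repeats
    "min_p": 0.05,              # drop tokens below 5% of the top token's probability
}
```

A dict like this is typically passed through to the serving API's generation parameters; lower temperature and higher repetition_penalty generally trade creativity for consistency.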