DavidAU/L3-Dark-Planet-8B-V2-Eight-Orbs-Of-Power
TEXT GENERATION · Concurrency Cost: 1 · Model Size: 8B · Quant: FP8 · Ctx Length: 8K · Published: Sep 18, 2024 · Architecture: Transformer

DavidAU/L3-Dark-Planet-8B-V2-Eight-Orbs-Of-Power is an 8-billion-parameter language model distributed as full-precision source weights, intended as a base for producing quantized formats such as GGUF, GPTQ, EXL2, AWQ, and HQQ. The model card stresses that specific parameter and sampler settings are critical for optimal operation across different AI/LLM applications. It is categorized as a "Class 1" model, meaning its performance depends significantly on careful configuration of these settings, which makes it best suited to users who want fine-grained control over model behavior.
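As a sketch of how sampler settings might be passed when querying this model through an OpenAI-compatible completions endpoint, the snippet below builds a request body. The prompt and every sampler value are illustrative assumptions, not recommendations from the model card:

```python
import json

# Hypothetical sampler values -- the model card stresses that the right
# settings matter for this "Class 1" model, but these exact numbers are
# placeholders, not its recommended configuration.
payload = {
    "model": "DavidAU/L3-Dark-Planet-8B-V2-Eight-Orbs-Of-Power",
    "prompt": "Write the opening line of a dark fantasy story.",
    "max_tokens": 128,
    "temperature": 0.8,          # illustrative
    "top_p": 0.95,               # illustrative
    "repetition_penalty": 1.1,   # illustrative
}

# Serialize to the JSON body an HTTP client would POST.
body = json.dumps(payload)
print(body)
```

An actual request would POST this body to the provider's completions endpoint with an authorization header; the snippet stops at constructing the payload.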


Popular Sampler Settings

The top 3 parameter combinations used by Featherless users for this model cover the following samplers:

- temperature
- top_p
- top_k
- frequency_penalty
- presence_penalty
- repetition_penalty
- min_p
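A minimal sketch of gathering the samplers above into one configuration, with a range check on each value. The accepted ranges and the example values are illustrative assumptions, not the ranges or settings the model card prescribes:

```python
# Commonly seen value ranges for each sampler (assumed for illustration;
# they are not taken from this model's card).
SAMPLER_RANGES = {
    "temperature": (0.0, 2.0),
    "top_p": (0.0, 1.0),
    "top_k": (0, 200),
    "frequency_penalty": (-2.0, 2.0),
    "presence_penalty": (-2.0, 2.0),
    "repetition_penalty": (1.0, 2.0),
    "min_p": (0.0, 1.0),
}

def validate_samplers(config: dict) -> dict:
    """Keep only known samplers; raise if any value falls outside its range."""
    checked = {}
    for name, value in config.items():
        if name not in SAMPLER_RANGES:
            continue  # ignore keys that are not sampler settings
        lo, hi = SAMPLER_RANGES[name]
        if not (lo <= value <= hi):
            raise ValueError(f"{name}={value} outside [{lo}, {hi}]")
        checked[name] = value
    return checked

# Illustrative values only -- not the model's recommended settings.
cfg = validate_samplers({
    "temperature": 0.8,
    "top_p": 0.95,
    "top_k": 40,
    "repetition_penalty": 1.1,
    "min_p": 0.05,
})
print(cfg)
```

A helper like this is useful because out-of-range sampler values often fail silently or degrade output quality rather than raising a clear error on the server side.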