sosoai/hansoldeco-SOLAR-10.7B-DPO

Hugging Face
TEXT GENERATION · Concurrency Cost: 1 · Model Size: 10.7B · Quant: FP8 · Ctx Length: 4k · License: apache-2.0 · Architecture: Transformer · Open Weights · Warm

The sosoai/hansoldeco-SOLAR-10.7B-DPO model is currently undergoing additional training. Specific details regarding its architecture, parameter count, context length, and primary differentiators are not yet available as the model is still under development.


Model Under Development


Key Information

  • Status: Undergoing additional training.
  • Availability: Specific technical specifications and performance metrics are pending completion of the training phase.

Users interested in this model should monitor its page for updates as development progresses. Once training is complete, more comprehensive details will be provided, covering its distinguishing features and potential applications.

Popular Sampler Settings

The three parameter combinations most commonly used by Featherless users for this model cover the following sampler settings:

  • temperature
  • top_p
  • top_k
  • frequency_penalty
  • presence_penalty
  • repetition_penalty
  • min_p
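As a rough illustration of how these sampler settings fit together, the sketch below builds a request body for an OpenAI-compatible completions endpoint. The numeric values are placeholders chosen for demonstration only, not the actual configurations used by Featherless users.

```python
# Hypothetical sampler configuration; values are illustrative
# placeholders, not measured user settings for this model.
sampler_config = {
    "temperature": 0.7,        # randomness of token selection
    "top_p": 0.9,              # nucleus sampling cutoff
    "top_k": 40,               # restrict to the k most likely tokens
    "frequency_penalty": 0.0,  # penalize tokens by how often they appear
    "presence_penalty": 0.0,   # penalize tokens that have appeared at all
    "repetition_penalty": 1.1, # multiplicative penalty on repeats
    "min_p": 0.05,             # drop tokens below this relative probability
}

# The sampler fields merge into a completions request body
# alongside the model name and prompt.
payload = {
    "model": "sosoai/hansoldeco-SOLAR-10.7B-DPO",
    "prompt": "Hello",
    **sampler_config,
}
```

In practice, such a payload would be sent as JSON to the provider's completions endpoint; which of these fields are honored depends on the serving backend.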