fblgit/una-cybertron-7b-v2-bf16
TEXT GENERATION · Open Weights
Concurrency cost: 1 · Model size: 7B · Quant: FP8 · Context length: 8k
Published: Dec 2, 2023 · License: apache-2.0 · Architecture: Transformer

fblgit/una-cybertron-7b-v2-bf16 is a 7-billion-parameter language model developed by Xavier M. at juanako.ai, based on the MistralAI architecture. The model is fine-tuned using SFT, DPO, and a proprietary Uniform Neural Alignment (UNA) technique, achieving a score of 69.67 on the Hugging Face Open LLM Leaderboard. It excels at mathematics, logic, and reasoning, demonstrating deep contextual understanding and attention to detail.


Popular Sampler Settings

The three most popular parameter combinations used by Featherless users for this model adjust the following sampler settings:

- temperature: scales the randomness of token sampling (lower is more deterministic)
- top_p: nucleus sampling; keeps the smallest set of tokens whose cumulative probability exceeds p
- top_k: samples only from the k most likely tokens
- frequency_penalty: penalizes tokens in proportion to how often they have already appeared
- presence_penalty: penalizes any token that has appeared at least once
- repetition_penalty: applies a multiplicative penalty to previously generated tokens
- min_p: discards tokens whose probability falls below a fraction of the most likely token's probability
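As a minimal sketch of how these settings are typically supplied, the snippet below assembles an OpenAI-compatible completion payload that sets every sampler field listed above. The specific values are illustrative placeholders, not the actual Featherless user statistics, and the helper function name is hypothetical.

```python
import json

def build_completion_payload(prompt: str) -> dict:
    """Assemble an OpenAI-compatible completion request body.

    All sampler values below are illustrative defaults, not the
    real per-tab configs from the Featherless page.
    """
    return {
        "model": "fblgit/una-cybertron-7b-v2-bf16",
        "prompt": prompt,
        "temperature": 0.7,         # randomness of token selection
        "top_p": 0.9,               # nucleus sampling cutoff
        "top_k": 40,                # restrict to the 40 most likely tokens
        "frequency_penalty": 0.0,   # penalize frequently repeated tokens
        "presence_penalty": 0.0,    # penalize any token already present
        "repetition_penalty": 1.1,  # multiplicative repeat penalty
        "min_p": 0.05,              # drop tokens below 5% of the top probability
    }

payload = build_completion_payload("Solve: what is 12 * 9?")
print(json.dumps(payload, indent=2))
```

The payload would then be POSTed as JSON to the provider's completions endpoint; only the field names matter here, and providers may ignore parameters they do not support (for example, `repetition_penalty` and `min_p` are not part of the core OpenAI API).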