agi-css/better-base
Text generation | Concurrency cost: 1 | Model size: 7B | Quant: FP8 | Context length: 4k | License: apache-2.0 | Architecture: Transformer | Open weights

agi-css/better-base is a 7-billion-parameter instruction-tuned language model based on LLaMA, developed by agi-css. It is distinguished by its training approach, Stable Alignment, which trains the model directly on social games to achieve alignment, bypassing traditional reward models. This method aims to be an efficient and stable alternative to RLHF, making the model suitable for applications that require socially aligned language generation.