luozhuanggary/CAIT-7b
Task: Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Context Length: 4k · Architecture: Transformer

luozhuanggary/CAIT-7b is a 7-billion-parameter language model with a 4096-token context length. The model ships with specific bitsandbytes quantization configurations, including 8-bit loading and fp4 quantization. This setup is geared toward efficient deployment and operation, making the model suitable for applications where resource optimization is critical.
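A minimal loading sketch with Hugging Face `transformers` and bitsandbytes, assuming the model is hosted under the repo id `luozhuanggary/CAIT-7b` and supports standard `AutoModelForCausalLM` loading. The example shows the fp4 4-bit configuration; for 8-bit loading, use `BitsAndBytesConfig(load_in_8bit=True)` instead (the two modes are mutually exclusive).

```python
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

# fp4 4-bit quantization, one of the configurations mentioned above.
# For 8-bit loading, replace this with BitsAndBytesConfig(load_in_8bit=True).
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="fp4",
)

model_id = "luozhuanggary/CAIT-7b"  # assumed repo id, from this card
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,
    device_map="auto",  # place layers on available GPU(s)/CPU automatically
)

prompt = "The capital of France is"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Note that the 4k context limit applies to prompt plus generated tokens combined, so long prompts leave correspondingly less room for generation.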
