alexkoo300/shaky-wildbeast
Text generation · Model size: 7B · Quantization: FP8 · Context length: 4k · Architecture: Transformer · Concurrency cost: 1
alexkoo300/shaky-wildbeast is a 7-billion-parameter language model. It was fine-tuned with PEFT on top of bitsandbytes 4-bit (NF4) quantization, a combination that keeps GPU memory requirements low during training. The same properties make it a reasonable choice for applications that need a compact yet capable language model and memory-efficient deployment and inference.