wesley7137/Neuro-Sci-PyWizCoder-13B-V1-merged
Text generation | Concurrency cost: 1 | Model size: 13B | Quant: FP8 | Context length: 4k | Architecture: Transformer | Cold

The wesley7137/Neuro-Sci-PyWizCoder-13B-V1-merged model is a 13 billion parameter language model with a 4096-token context length. It was trained using bitsandbytes 4-bit quantization (nf4) and PEFT 0.4.0. Its primary differentiator and intended use case are not explicitly detailed in the provided README, which focuses on the training configuration.
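
Because the README documents only the training setup, the sketch below shows one plausible way to load and run the model for inference through the standard transformers and bitsandbytes stack, reusing the nf4 4-bit configuration noted above. The compute dtype, device placement, generation settings, and example prompt are illustrative assumptions, not details taken from the model card.

```python
# Minimal inference sketch, assuming the model works with the standard
# transformers + bitsandbytes loading path. Settings marked "assumed" are
# not specified in the model's README.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "wesley7137/Neuro-Sci-PyWizCoder-13B-V1-merged"

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",               # matches the nf4 quantization noted in the card
    bnb_4bit_compute_dtype=torch.bfloat16,   # assumed compute dtype
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,
    device_map="auto",                       # assumed device placement
)

prompt = "Explain the role of dopamine in reinforcement learning."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```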
