weijietling/medgemma-4b-it-contrastive-trained-150126-mvs-ablation
Vision · Concurrency Cost: 1 · Model Size: 4.3B · Quant: BF16 · Ctx Length: 32k · Published: Jan 15, 2026 · Architecture: Transformer
The weijietling/medgemma-4b-it-contrastive-trained-150126-mvs-ablation model is a 4.3 billion parameter instruction-tuned model. Its name suggests a contrastive-training ablation of MedGemma 4B IT, a medical vision-language model, though the model card provides no training details or differentiators to confirm this. With a context length of 32,768 tokens, it can process and generate long sequences in a single pass. Given its instruction-tuned nature, its primary use cases would likely be conversational and text-generation tasks that require understanding and producing human-like text.
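One practical implication of the 32,768-token context length is that prompt and generation share the same window: the longer the prompt, the fewer tokens remain for the model to generate. A minimal sketch of that budgeting (the `max_new_tokens` helper is illustrative, not part of any published API, and real token counts require the model's tokenizer):

```python
# Illustrative sketch: budgeting prompt vs. generation tokens within the
# model's 32,768-token context window. Token counts here are hypothetical;
# a real tokenizer is needed to measure an actual prompt.
CTX_LENGTH = 32_768  # context length stated on the model card

def max_new_tokens(prompt_tokens: int, ctx_length: int = CTX_LENGTH) -> int:
    """Tokens left for generation after the prompt fills part of the window."""
    return max(ctx_length - prompt_tokens, 0)

print(max_new_tokens(30_000))  # → 2768
print(max_new_tokens(40_000))  # → 0 (prompt alone overflows the window)
```

A prompt near the window's size leaves almost no room for output, so long-document workflows typically truncate or chunk their inputs before generation.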