anonymuspj7/model_sft_resta
Text generation
Concurrency cost: 1 | Model size: 1.5B | Quantization: BF16 | Context length: 32k | Published: Mar 25, 2026 | Architecture: Transformer | Warm
anonymuspj7/model_sft_resta is a 1.5-billion-parameter language model with a 32,768-token context window. It is a fine-tuned transformer, though specific architectural details and training data are not provided. Its intended applications and differentiators are likewise undocumented, suggesting it is either a general-purpose language model or a base model intended for further specialization.
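A rough sizing sketch based only on the figures in the listing (1.5B parameters, BF16): at 2 bytes per parameter, the weights alone need about 3 GB of memory. Hidden size and layer count are not published, so KV-cache and activation memory are deliberately left out of this estimate.

```python
# Back-of-envelope memory estimate for serving this model in BF16.
# Only the published figures (1.5B params, BF16) are used; KV-cache and
# activation memory are omitted because the architecture is undisclosed.

def weight_memory_gb(num_params: float, bytes_per_param: int = 2) -> float:
    """Approximate weight memory in decimal gigabytes (BF16 = 2 bytes/param)."""
    return num_params * bytes_per_param / 1e9

if __name__ == "__main__":
    print(f"Weights: ~{weight_memory_gb(1.5e9):.1f} GB in BF16")
```

In practice a serving runtime also allocates KV-cache that grows with the 32k context length, so real memory use at long contexts will be noticeably higher than this floor.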