Ansh-Sarkar/model_sft_full
Text generation · Concurrency cost: 1 · Model size: 1.5B · Quant: BF16 · Context length: 32k · Published: Apr 5, 2026 · Architecture: Transformer

Ansh-Sarkar/model_sft_full is a 1.5-billion-parameter language model with a 32,768-token context length. It is a Hugging Face Transformers model that was automatically pushed to the Hub. Further details about its architecture, training data, specific capabilities, and intended use cases are not provided in the available model card.
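Since the model card gives no usage instructions, the sketch below shows one plausible way to load it with the Hugging Face `transformers` library, using the model id and the BF16/32k figures listed above. The generation settings and the `clamp_new_tokens` helper are illustrative assumptions, not part of the model card.

```python
# Hypothetical usage sketch for Ansh-Sarkar/model_sft_full.
# The model id, BF16 dtype, and 32k context length come from the listing above;
# everything else (prompt, generation settings) is an assumption.

MODEL_ID = "Ansh-Sarkar/model_sft_full"
MAX_CONTEXT = 32_768  # 32k-token context length listed above


def clamp_new_tokens(prompt_tokens: int, requested: int, ctx: int = MAX_CONTEXT) -> int:
    """Cap generated tokens so prompt + output fits within the context window."""
    return max(0, min(requested, ctx - prompt_tokens))


def generate(prompt: str, max_new_tokens: int = 64) -> str:
    """Load the model in BF16 and generate a completion (downloads weights)."""
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype=torch.bfloat16)

    inputs = tokenizer(prompt, return_tensors="pt")
    budget = clamp_new_tokens(inputs["input_ids"].shape[1], max_new_tokens)
    out = model.generate(**inputs, max_new_tokens=budget)
    return tokenizer.decode(out[0], skip_special_tokens=True)


# Example (downloads ~1.5B parameters on first run):
#     print(generate("Hello, my name is"))
```

The `clamp_new_tokens` guard matters for a fixed 32k window: requesting more output tokens than the remaining context allows would otherwise truncate or error depending on the backend.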
