Ikonz-Studios/seva-sarathi-intent-qwen3-1.7b
Task: Text Generation
Concurrency Cost: 1
Model Size: 2B
Quant: BF16
Ctx Length: 32k
Published: Mar 24, 2026
Architecture: Transformer
Status: Warm
Ikonz-Studios/seva-sarathi-intent-qwen3-1.7b is a language model with roughly 2 billion parameters (1.7B, per the model name) and a 32,768-token context window. Developed by Ikonz-Studios, it is based on the Qwen3 architecture. The name suggests an intent-classification focus, but the card does not detail a primary use case or specific differentiators, so it may be a foundational or general-purpose checkpoint awaiting further fine-tuning or a defined application.
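The card does not document a usage API. As a minimal sketch, assuming the checkpoint is a standard Hugging Face causal LM used for intent classification, the snippet below builds an illustrative prompt; the instruction wording and label set are assumptions, not part of the published card. The (commented-out) loading pattern follows the usual Hugging Face convention, with bfloat16 matching the BF16 quantization listed above.

```python
# Hypothetical usage sketch: prompt format and intent labels below are
# illustrative assumptions, not documented behavior of this model.

def build_intent_prompt(utterance: str) -> str:
    """Wrap a user utterance in a simple intent-classification prompt."""
    labels = ["greeting", "complaint", "booking", "other"]  # hypothetical label set
    return (
        "Classify the intent of the user message.\n"
        f"Possible intents: {', '.join(labels)}\n"
        f"Message: {utterance}\n"
        "Intent:"
    )

# Loading would follow the standard Hugging Face pattern; commented out so
# the sketch stays runnable without downloading the ~2B-parameter weights:
#
# from transformers import AutoModelForCausalLM, AutoTokenizer
# repo = "Ikonz-Studios/seva-sarathi-intent-qwen3-1.7b"
# tok = AutoTokenizer.from_pretrained(repo)
# model = AutoModelForCausalLM.from_pretrained(repo, torch_dtype="bfloat16")

print(build_intent_prompt("I want to reschedule my appointment"))
```

The prompt-building step is kept separate from model loading so it can be tested without the weights; only the final `Intent:` completion would come from the model itself.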