Ratnesh123/antigravity-qwen2.5-1.5b
TEXT GENERATION · Concurrency Cost: 1 · Model Size: 1.5B · Quant: BF16 · Ctx Length: 32k · Published: Feb 22, 2026 · Architecture: Transformer · Warm

Ratnesh123/antigravity-qwen2.5-1.5b is a 1.5-billion-parameter causal language model based on the Qwen2.5 architecture, with a context length of 32,768 tokens. Its model card provides little detail, so specific differentiators or primary use cases beyond general language generation cannot be stated with confidence.
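A 32,768-token context window has to hold both the prompt and the generated output. The sketch below shows the basic budget arithmetic; the idea of reserving a fixed generation headroom is an illustrative policy, not something specified by the model card.

```python
# Sketch: budgeting prompt vs. generation tokens within the model's
# 32,768-token context window.

CTX_LENGTH = 32_768  # context length stated for this model


def max_prompt_tokens(ctx_length: int, max_new_tokens: int) -> int:
    """Tokens left for the prompt once generation headroom is reserved."""
    if max_new_tokens >= ctx_length:
        raise ValueError("generation budget exceeds the context window")
    return ctx_length - max_new_tokens


print(max_prompt_tokens(CTX_LENGTH, 1024))  # 31744
```

Reserving 1,024 tokens for generation leaves 31,744 tokens of prompt room; longer generations shrink the prompt budget one-for-one.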


Popular Sampler Settings

The three most popular parameter combinations used by Featherless users for this model cover the following sampler settings:

- temperature
- top_p
- top_k
- frequency_penalty
- presence_penalty
- repetition_penalty
- min_p
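These knobs are typically set per request. A minimal sketch of bundling them into an OpenAI-style completion request body follows; the default values here are illustrative assumptions, not the actual "popular" configurations from this page.

```python
# Sketch: assembling the sampler settings listed above into a request body.
# The defaults below are illustrative assumptions, not recommended values.


def build_sampler_payload(prompt: str, **overrides) -> dict:
    """Assemble a completion request body with common sampler settings."""
    payload = {
        "model": "Ratnesh123/antigravity-qwen2.5-1.5b",
        "prompt": prompt,
        "temperature": 0.7,
        "top_p": 0.9,
        "top_k": 40,
        "frequency_penalty": 0.0,
        "presence_penalty": 0.0,
        "repetition_penalty": 1.1,
        "min_p": 0.05,
    }
    payload.update(overrides)  # caller can swap in a preferred config
    return payload


body = build_sampler_payload("Hello", temperature=1.0)
print(body["temperature"])  # 1.0
```

Passing any of the listed parameters as a keyword argument overrides the sketch's default, so each of the three popular configurations could be expressed as a set of overrides.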