Ba2han/llama-3.3_gemini-reasoning
Task: Text Generation
Concurrency Cost: 1
Model Size: 8B
Quant: FP8
Ctx Length: 8k
Published: Jan 23, 2026
Architecture: Transformer
Status: Cold

Ba2han/llama-3.3_gemini-reasoning is an 8-billion-parameter language model based on the Llama 3.3 architecture and fine-tuned for reasoning tasks. Its benchmark results on shuffled object tracking point to a focus on object tracking and complex multi-step reasoning. The model targets applications that require logical deduction and an understanding of object relationships, all within its 8192-token context window.
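One practical consequence of the 8192-token limit is that long conversations must be trimmed before being sent to the model. The sketch below shows one simple strategy, dropping the oldest messages first; the helper name and the whitespace-based stand-in tokenizer are illustrative assumptions, and a real deployment would count tokens with the model's own tokenizer.

```python
# Hypothetical helper: trim the oldest messages so a conversation fits the
# model's 8192-token context window, leaving room for the reply.
CTX_LEN = 8192  # context length stated on this card

def fit_to_context(messages, count_tokens, max_new_tokens=512):
    """Drop oldest messages until prompt + planned reply fit in the window.

    `count_tokens` is any callable returning a token count for a string;
    here a crude whitespace count stands in for the real tokenizer.
    """
    budget = CTX_LEN - max_new_tokens
    kept = list(messages)
    while kept and sum(count_tokens(m) for m in kept) > budget:
        kept.pop(0)  # discard the oldest message first
    return kept

# Stand-in tokenizer: roughly one token per whitespace-separated word.
words = lambda s: len(s.split())
history = ["a " * 5000, "b " * 4000, "c " * 100]
print(len(fit_to_context(history, words)))  # oldest long message dropped
```

This keeps the most recent context, which is usually what matters for multi-turn reasoning; alternatives such as summarizing dropped turns trade latency for retained information.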
