astom-M/matsuo-llm-advanced-phase-imdb1
Text Generation · Concurrency Cost: 1 · Model Size: 7.6B · Quant: FP8 · Context Length: 32k · Published: Mar 1, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights

astom-M/matsuo-llm-advanced-phase-imdb1 is a 7.6-billion-parameter language model fine-tuned from Qwen2.5-7B-Instruct, with a 32,768-token context length. Developed by astom-M, it specializes in agentic tasks, particularly database operations (SQL query generation) and household navigation. It was trained with Supervised Fine-Tuning (SFT) using QLoRA and instruction masking on a dataset of 6,750 samples combining database-operation and synthetic household-task trajectories.
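The model card does not publish training code, but the "instruction masking" it mentions is commonly implemented by setting the labels of prompt tokens to the cross-entropy ignore index (-100), so the SFT loss is computed only on the response tokens. A minimal sketch, with hypothetical token IDs and a made-up `build_labels` helper:

```python
# Sketch of instruction masking for SFT (illustrative; not the author's code).
# Tokens belonging to the instruction/prompt get label -100, which
# PyTorch's cross-entropy loss ignores, so only response tokens train.

IGNORE_INDEX = -100  # standard ignore index for torch.nn.CrossEntropyLoss

def build_labels(input_ids, prompt_len):
    """Mask the first `prompt_len` tokens; keep response tokens as targets."""
    return [IGNORE_INDEX] * prompt_len + list(input_ids[prompt_len:])

# Example: a 6-token sequence whose first 4 tokens are the instruction.
ids = [101, 7592, 2088, 102, 3231, 102]
labels = build_labels(ids, prompt_len=4)
print(labels)  # → [-100, -100, -100, -100, 3231, 102]
```

With QLoRA, this labeling step is unchanged: the base model is loaded in 4-bit and low-rank adapters are trained, but the masked labels still determine which positions contribute to the loss.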
