muradAgain/programmatic-adtech-llm-mistral7b
Task: Text generation
- Model size: 7B parameters
- Quantization: FP8
- Context length: 4k tokens
- Concurrency cost: 1
- Published: Mar 15, 2026
- License: apache-2.0
- Architecture: Transformer (open weights)

muradAgain/programmatic-adtech-llm-mistral7b is a 7-billion-parameter language model fine-tuned from Mistral-7B-Instruct-v0.2 for the programmatic advertising domain. Developed by muradAgain, it was trained with QLoRA on a specialized dataset of 96 ad-tech Q&A pairs, making fine-tuning feasible on modest hardware. The model is intended as a Q&A assistant and educational tool for programmatic advertising, covering topics such as supply path, ad fraud, and CTV.
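Since the base model is Mistral-7B-Instruct-v0.2, queries should be wrapped in Mistral's `[INST] … [/INST]` instruction format before generation. A minimal sketch of that prompt construction (the example question is illustrative, not from the model's dataset):

```python
# Sketch: wrap an ad-tech question in the Mistral-Instruct prompt format.
# The <s>[INST] ... [/INST] wrapper is the documented chat format for
# Mistral-7B-Instruct-v0.2, which this model was fine-tuned from.

def build_prompt(question: str) -> str:
    """Return a single-turn Mistral-Instruct prompt for the given question."""
    return f"<s>[INST] {question.strip()} [/INST]"

prompt = build_prompt("What is supply path optimization in programmatic advertising?")
print(prompt)
# -> <s>[INST] What is supply path optimization in programmatic advertising? [/INST]
```

The resulting string can then be passed to any inference stack that serves the model (e.g. a `transformers` text-generation pipeline), or the same structure can be produced automatically via the tokenizer's chat template.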
