waddie/mini-2.0

TEXT GENERATION · Open Weights

  • Model Size: 7.6B
  • Quantization: FP8
  • Context Length: 32k
  • Concurrency Cost: 1
  • Published: Apr 28, 2026
  • License: apache-2.0
  • Architecture: Transformer

waddie/mini-2.0 is a 7.6 billion parameter language model developed by Edward Fazackerley, fine-tuned from Qwen2.5-7B-Instruct with a 32,768-token context length. It is designed to mimic informal, casual, technically flavored human conversation, adopting a "random guy" persona with lowercase-heavy, slightly secretive slang. Its primary use case is Discord bots or roleplay scenarios that need highly human-like, informal interaction rather than a formal AI assistant.


CloudWaddie Mini 2.0 Overview

waddie/mini-2.0 is a 7.6 billion parameter language model developed by Edward Fazackerley, fine-tuned from Qwen2.5-7B-Instruct. Unlike typical helpful and formal AI assistants, this model is uniquely designed to adopt a "random guy" persona, mimicking the specific conversational rhythm, slang, and technical jargon of a human. It was trained on 10,000 Discord conversations from an AI Leaks community to capture a casual, lowercase-heavy, and slightly secretive "insider" vibe.

Key Capabilities

  • Informal Human-like Interaction: Excels at generating responses with casual language, slang, and technical jargon.
  • Unique Persona: Adopts a "random guy" persona, characterized by lowercase-only text and a secretive tone.
  • Context Length: Supports a substantial context window of 32,768 tokens.
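Since the model is fine-tuned from Qwen2.5-7B-Instruct, it presumably inherits the base model's ChatML-style chat template. A minimal sketch of building such a prompt by hand follows; the persona system prompt here is an illustrative guess, not the model's documented training prompt:

```python
# Build a ChatML-style prompt as used by Qwen2.5-Instruct models.
# The role tags follow the base model's template; the system text below
# is a hypothetical example, not the model's actual system prompt.
def build_chatml_prompt(messages):
    """Render a list of {role, content} dicts into a ChatML string."""
    parts = []
    for msg in messages:
        parts.append(f"<|im_start|>{msg['role']}\n{msg['content']}<|im_end|>\n")
    # Leave the assistant turn open so the model completes it.
    parts.append("<|im_start|>assistant\n")
    return "".join(parts)

prompt = build_chatml_prompt([
    {"role": "system", "content": "you are a random guy in a discord server"},
    {"role": "user", "content": "yo what model are you running"},
])
print(prompt)
```

In practice, `tokenizer.apply_chat_template` from the `transformers` library would render this for you from the model's bundled template; the manual version above just makes the format explicit.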

Good for

  • Discord Bots: Ideal for creating bots that require a highly informal, human-like conversational style.
  • Roleplay Scenarios: Suitable for applications where a casual, non-robotic interaction is preferred.
  • Specific Conversational Mimicry: Best for use cases needing to replicate a very particular, insider-like communication style.
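If the model is hosted behind an OpenAI-compatible chat completions endpoint (a common serving setup, assumed here rather than documented), a Discord-bot handler mostly amounts to assembling a request payload from recent channel history. A minimal sketch, with placeholder sampling values:

```python
# Sketch of a chat-completions request payload for a hosted copy of the
# model. The API shape and sampling settings are assumptions about a
# typical OpenAI-compatible host, not documented specifics of this model.
def make_chat_request(history, user_message, max_history=20):
    """Build a payload from recent channel history plus the new message."""
    messages = history[-max_history:] + [
        {"role": "user", "content": user_message}
    ]
    return {
        "model": "waddie/mini-2.0",
        "messages": messages,
        "temperature": 0.9,   # higher temperature suits a casual persona
        "max_tokens": 256,    # Discord replies are short; cap generation
    }

payload = make_chat_request(
    [{"role": "assistant", "content": "nah i cant say lol"}],
    "come on just tell me",
)
```

Capping the history window keeps each request comfortably inside the 32k context while preserving enough of the channel's back-and-forth for the persona to stay in character.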