DuckyBlender/diegogpt-v2-mlx-bf16 is a 0.8-billion-parameter language model fine-tuned by DuckyBlender from Qwen/Qwen3-0.6B-MLX-bf16. It was trained on a dataset of public replies from a single individual, using mlx-lm for efficient training on Apple Silicon. The model generates text in the style and persona of that individual, making it suitable for specialized conversational or persona-based applications.
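
Since the model was trained with mlx-lm, it can be run with the same library's Python API. The sketch below is illustrative, not from the model card: it assumes Apple Silicon, an `mlx-lm` install (`pip install mlx-lm`), and example sampling settings of my own choosing; only the repository id comes from the description above.

```python
# Minimal sketch of running the model with mlx-lm (requires Apple Silicon).
# The prompt text and max_tokens value are assumptions for illustration.
from mlx_lm import load, generate

# Downloads the weights from the Hugging Face Hub on first use.
model, tokenizer = load("DuckyBlender/diegogpt-v2-mlx-bf16")

# Qwen3-based models expect a chat template around the user turn.
messages = [{"role": "user", "content": "Hello! How are you today?"}]
prompt = tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)

text = generate(model, tokenizer, prompt=prompt, max_tokens=128)
print(text)
```

The `mlx_lm.generate` command-line tool accepts the same repository id via `--model` for quick experiments without writing any Python.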