prithivMLmods/Qwen2.5-0.5B-200K is a 0.5-billion-parameter causal language model by prithivMLmods, fine-tuned from the unsloth/Qwen2.5-0.5B-bnb-4bit base checkpoint on the HuggingFaceH4/ultrachat_200k dataset. It targets English-language tasks and is intended for applications that need a compact yet capable model, particularly conversational and instruction-following use cases reflecting its training data.
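A typical way to try a compact chat model like this is through the transformers library. The sketch below is illustrative, not from the model card: it assumes the checkpoint loads with the standard AutoModelForCausalLM API and exposes a Qwen2.5-style chat template; the prompt, dtype, and generation settings are example choices.

```python
# Minimal usage sketch (assumptions: standard transformers loading,
# Qwen2.5-style chat template; prompt and settings are illustrative).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "prithivMLmods/Qwen2.5-0.5B-200K"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # assumption: fp16 is enough for a 0.5B model
    device_map="auto",
)

# Build a chat-style prompt, matching the UltraChat-style instruction data.
messages = [
    {"role": "user", "content": "Explain what a causal language model is in one sentence."},
]
inputs = tokenizer.apply_chat_template(
    messages,
    add_generation_prompt=True,
    return_tensors="pt",
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=128, do_sample=False)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```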