pragnyanramtha/chandler
Type: Text generation
Model Size: 8B
Quant: FP8
Ctx Length: 32k
Concurrency Cost: 1
Published: Jan 12, 2026
License: apache-2.0
Architecture: Transformer
Weights: Open
pragnyanramtha/chandler is an 8-billion-parameter instruction-tuned causal language model developed by pragnyanramtha. Finetuned from unsloth/llama-3.1-8b-Instruct, it uses Unsloth together with Hugging Face's TRL library for accelerated training. The model is intended for general instruction-following tasks, building on its Llama 3.1 base and an efficient finetuning process.
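Since the model inherits the Llama 3.1 instruct chat template from its base, a single-turn prompt can be assembled as sketched below. The special-token strings follow the published Llama 3.1 template; in real use, loading the tokenizer and calling `tokenizer.apply_chat_template` is the safer route, so treat this as an illustrative sketch only.

```python
def build_llama31_prompt(system: str, user: str) -> str:
    """Assemble a single-turn prompt in the Llama 3.1 instruct chat
    format, which this model inherits from llama-3.1-8b-Instruct.
    Illustrative only: prefer tokenizer.apply_chat_template in practice."""
    return (
        "<|begin_of_text|>"
        "<|start_header_id|>system<|end_header_id|>\n\n"
        f"{system}<|eot_id|>"
        "<|start_header_id|>user<|end_header_id|>\n\n"
        f"{user}<|eot_id|>"
        # The trailing assistant header cues the model to generate its reply.
        "<|start_header_id|>assistant<|end_header_id|>\n\n"
    )

prompt = build_llama31_prompt(
    "You are a helpful assistant.",
    "Summarize the TRL library in one sentence.",
)
print(prompt)
```

The prompt string ends with an open assistant header, so generation continues as the assistant's turn until the model emits an end-of-turn token.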