pihu21057w/jp
Text Generation · Model Size: 8B · Quant: FP8 · Context Length: 32K · Concurrency Cost: 1 · Published: Mar 3, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights · Cold

pihu21057w/jp is an 8-billion-parameter instruction-tuned causal language model, fine-tuned by pihu21057w from unsloth/llama-3.1-8b-instruct-unsloth-bnb-4bit. It was trained with Unsloth for faster, more memory-efficient fine-tuning, making it a practical option for general natural language processing tasks. Its 32K context length makes it suitable for applications that process longer inputs.
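The card does not include a usage snippet, so here is a minimal sketch of how a model like this is typically loaded with the Hugging Face `transformers` library. The repo id `pihu21057w/jp`, the availability of a chat template, and the `truncate_to_context` helper are assumptions for illustration, not confirmed details of this model.

```python
def truncate_to_context(token_ids, max_ctx=32_768, reserve_for_output=1_024):
    """Keep only the most recent tokens so that prompt + generation
    fit inside the model's 32K context window (budget is an assumption)."""
    budget = max_ctx - reserve_for_output
    return token_ids[-budget:]


if __name__ == "__main__":
    # Heavy model download/load kept behind the main guard.
    from transformers import AutoModelForCausalLM, AutoTokenizer  # pip install transformers

    model_id = "pihu21057w/jp"  # assumed Hub repo id, taken from the card title
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id, torch_dtype="auto", device_map="auto"
    )

    messages = [{"role": "user", "content": "Summarize Hamlet in two sentences."}]
    inputs = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)
    out = model.generate(inputs, max_new_tokens=256)
    print(tokenizer.decode(out[0][inputs.shape[-1]:], skip_special_tokens=True))
```

For long-running chats, applying `truncate_to_context` to the tokenized history before generation keeps the prompt within the advertised 32K window.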
