Yuichi1218/Llama-3.1-Non-filter-Lafeak91-8B-chatvector
Text generation · Concurrency cost: 1 · Model size: 8B · Quant: FP8 · Context length: 32k · Architecture: Transformer

Yuichi1218/Llama-3.1-Non-filter-Lafeak91-8B-chatvector is an 8-billion-parameter language model, likely based on the Llama 3.1 architecture, with a context length of 32,768 tokens. The "Non-filter" naming suggests it was fine-tuned to be unfiltered, with an emphasis on uncensored or raw conversational output. Its primary use case appears to be chat applications where unfiltered responses are desired.
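As a chat-oriented Llama 3.1 derivative, the model would typically be run through the Hugging Face `transformers` chat-template flow. The sketch below is an assumption about usage, not something documented on this card: the repo id is taken from the title, while dtype, device placement, and generation parameters are illustrative defaults.

```python
MODEL_ID = "Yuichi1218/Llama-3.1-Non-filter-Lafeak91-8B-chatvector"

def build_chat(user_message: str) -> list:
    """Format a single-turn conversation in the standard chat-messages schema
    consumed by tokenizer.apply_chat_template."""
    return [{"role": "user", "content": user_message}]

def main():
    # Heavy imports are kept local so the sketch can be read and the
    # pure helper tested without the transformers/torch stack installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype="auto",   # FP8 checkpoint per the card; "auto" defers to the config
        device_map="auto",    # assumes an accelerator with room for an 8B model
    )

    # Llama 3.1 models ship a chat template; apply it to produce the prompt.
    prompt = tokenizer.apply_chat_template(
        build_chat("Hello!"), tokenize=False, add_generation_prompt=True
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=256)

    # Decode only the newly generated tokens, skipping the prompt.
    print(tokenizer.decode(
        outputs[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True
    ))

if __name__ == "__main__":
    main()
```

The 32k context window leaves substantial room for multi-turn history, but actual throughput depends on the serving hardware and the FP8 quantization path available there.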
