dariolopez/llama-2-7b-miniguanaco
Text generation · Concurrency cost: 1 · Model size: 7B · Quantization: FP8 · Context length: 4k · Architecture: Transformer

The dariolopez/llama-2-7b-miniguanaco model is a 7-billion-parameter fine-tuned variant of Llama 2, created by dariolopez by following a tutorial on fine-tuning Llama 2. Its primary use case is experimentation and learning about the fine-tuning process for large language models, rather than production-ready applications.
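As a sketch of how such a fine-tune is typically queried: tutorial-style Llama 2 fine-tunes are usually trained on the Llama 2 instruction template, so prompts at inference time should follow the same format. The `format_prompt` helper below is hypothetical (not part of this model card), and the exact template is an assumption based on that common convention.

```python
from typing import Optional


def format_prompt(instruction: str, response: Optional[str] = None) -> str:
    """Build a Llama 2 instruction-style prompt.

    Assumption: the model was fine-tuned on the common
    "<s>[INST] ... [/INST] ... </s>" template used in Llama 2
    fine-tuning tutorials.
    """
    prompt = f"<s>[INST] {instruction} [/INST]"
    if response is not None:
        # Training-time sample: append the target response and close the turn.
        prompt += f" {response} </s>"
    return prompt


# Inference-time prompt: leave the response empty so the model completes it.
print(format_prompt("What is fine-tuning?"))
```

With the Hugging Face `transformers` library, a string built this way would be passed to a text-generation pipeline loading `dariolopez/llama-2-7b-miniguanaco`.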
