andrewprayle/llama-2-7b-miniguanaco
Text generation · Model size: 7B · Quantization: FP8 · Context length: 4k · Architecture: Transformer · Concurrency cost: 1
The andrewprayle/llama-2-7b-miniguanaco model is a 7-billion-parameter fine-tune of the Llama 2 architecture, based on NousResearch/llama-2-7b-chat-hf. It was fine-tuned on the mlabonne/guanaco-llama2-1k dataset, primarily as a learning exercise by its creator, Andrew Prayle. It is intended for internal use to explore automating medical literature searching within CochraneCF.
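As a rough sketch of how the model might be prompted: the mlabonne/guanaco-llama2-1k dataset is formatted in the standard Llama 2 `[INST]` chat template, so it is a reasonable assumption (not confirmed by this card) that the fine-tune expects the same format. The helper below builds such a prompt; the commented-out lines show how one would then generate text with the Hugging Face `transformers` pipeline.

```python
# Assumption: this fine-tune follows the Llama 2 [INST] prompt template
# used by its training data (mlabonne/guanaco-llama2-1k).

def format_prompt(instruction: str) -> str:
    """Wrap a user instruction in the Llama 2 instruction template."""
    return f"<s>[INST] {instruction.strip()} [/INST]"

prompt = format_prompt("Summarise the inclusion criteria in this abstract.")
print(prompt)
# → <s>[INST] Summarise the inclusion criteria in this abstract. [/INST]

# To actually run generation (downloads ~13 GB of weights):
# from transformers import pipeline
# generator = pipeline("text-generation",
#                      model="andrewprayle/llama-2-7b-miniguanaco")
# print(generator(prompt, max_new_tokens=128)[0]["generated_text"])
```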