ibivibiv/bubo-bubo-13b
Text generation · Concurrency cost: 1 · Model size: 13B · Quant: FP8 · Context length: 4k · Published: Jan 24, 2024 · License: llama2 · Architecture: Transformer · Open weights

ibivibiv/bubo-bubo-13b is a 13-billion-parameter auto-regressive language model developed by ibivibiv, fine-tuned from the Llama 2 transformer architecture. This English-language model is trained and optimized for summarization tasks, particularly excelling at summarizing communication chains. It has a 4096-token context length and uses an Alpaca-style prompt template for instruction-response interactions.
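Since the model expects an Alpaca-style prompt, a minimal sketch of how such a prompt is typically assembled is shown below. The model card does not reproduce the exact template wording, so the header text and section labels here follow the common Alpaca convention and are an assumption, not the model's confirmed format.

```python
# Sketch of an Alpaca-style prompt builder. The header sentence and
# the "### Instruction / ### Input / ### Response" labels follow the
# widely used Alpaca convention; the exact template this model was
# tuned on is not specified in the card, so treat this as illustrative.
def build_alpaca_prompt(instruction: str, input_text: str = "") -> str:
    header = (
        "Below is an instruction that describes a task. "
        "Write a response that appropriately completes the request.\n\n"
    )
    if input_text:
        return (
            header
            + f"### Instruction:\n{instruction}\n\n"
            + f"### Input:\n{input_text}\n\n"
            + "### Response:\n"
        )
    return header + f"### Instruction:\n{instruction}\n\n### Response:\n"

# Example: prompting the model to summarize a message chain
# (hypothetical email content for illustration).
prompt = build_alpaca_prompt(
    "Summarize the following email thread.",
    "From: alice\nTo: bob\nSubject: Q3 planning\n...",
)
```

When summarizing long communication chains, keep in mind that the prompt plus the generated summary must fit within the 4096-token context window.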