xinyifang/ArxivLlama
Task: Text generation
Model size: 8B
Quantization: FP8
Context length: 32k
Concurrency cost: 1
Architecture: Transformer
Published: Feb 28, 2025
License: apache-2.0

ArxivLlama by xinyifang is an 8-billion-parameter Llama model fine-tuned from unsloth/meta-llama-3.1-8b-instruct-bnb-4bit, with a 32,768-token context length. Developed for node classification on text-attributed graphs, it leverages a multi-profiling framework for data augmentation. It is tuned for classifying scientific papers into arXiv categories and reports strong performance on node-classification benchmarks such as ogbn-arxiv and ogbn-products.
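A minimal sketch of using the model for paper classification with Hugging Face transformers. The repository id comes from the listing above; the prompt wording, the example paper, and the generation settings are assumptions for illustration, not the model's documented interface.

```python
# Sketch: load ArxivLlama and ask it for an arXiv category.
# Assumes the checkpoint is available under the listed repo id and works with
# the standard Llama chat template; adjust dtype/device settings for your hardware.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "xinyifang/ArxivLlama"  # repo id from the listing
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto", torch_dtype="auto")

# Hypothetical classification prompt: pass the paper's title and abstract and
# request a single arXiv subject category (the fine-tuning prompt template is
# not documented here, so this is only a plausible format).
paper = (
    "Title: Attention Is All You Need\n"
    "Abstract: We propose the Transformer, a model architecture based entirely on attention..."
)
messages = [
    {"role": "user", "content": f"Classify the following paper into an arXiv category.\n\n{paper}"}
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=32, do_sample=False)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```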
