Lk123/InfoSeek-7B-RFT
Task: Text Generation · Model Size: 7.6B · Quantization: FP8 · Context Length: 32k · Published: Nov 10, 2025 · License: apache-2.0 · Architecture: Transformer

InfoSeek-7B-RFT by Lk123 is a 7.6-billion-parameter language model with a 32,768-token context window. It is fine-tuned for retrieval-augmented generation (RAG) tasks: seeking out and synthesizing relevant information from provided contexts. The model is tuned for accuracy and coherence when grounding responses in external knowledge sources, making it well suited to applications that require precise information extraction and summarization.
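A typical RAG deployment of a 32k-context model packs retrieved passages into the prompt until the context budget is spent. The sketch below illustrates that packing step; the prompt layout and the whitespace-based token estimate are assumptions for illustration, not the model's actual chat template (a real deployment would use the model's own tokenizer and template).

```python
# Sketch: assembling a retrieval-augmented prompt for a 32k-context model.
# ASSUMPTIONS: the prompt format and the rough token estimate below are
# illustrative only, not InfoSeek-7B-RFT's actual template or tokenizer.

CTX_LIMIT = 32_768   # model context length in tokens
RESERVED = 1_024     # budget kept free for the generated answer

def approx_tokens(text: str) -> int:
    """Rough token estimate (~1.3 tokens per whitespace-split word)."""
    return int(len(text.split()) * 1.3)

def build_rag_prompt(question: str, passages: list[str]) -> str:
    """Pack retrieved passages into the prompt, highest-ranked first,
    stopping before the context budget is exhausted."""
    budget = CTX_LIMIT - RESERVED - approx_tokens(question)
    kept = []
    for passage in passages:
        cost = approx_tokens(passage)
        if cost > budget:
            break
        kept.append(passage)
        budget -= cost
    context = "\n\n".join(f"[{i + 1}] {p}" for i, p in enumerate(kept))
    return (
        "Answer the question using only the passages below.\n\n"
        f"{context}\n\nQuestion: {question}\nAnswer:"
    )

prompt = build_rag_prompt(
    "When was InfoSeek-7B-RFT published?",
    [
        "InfoSeek-7B-RFT was published on Nov 10, 2025.",
        "It is a 7.6B parameter model with a 32k context length.",
    ],
)
print(prompt)
```

The resulting string would then be sent to the model via whatever serving stack hosts it (e.g. an OpenAI-compatible completions endpoint); keeping answer headroom (`RESERVED`) prevents the generation from being truncated at the context limit.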
