socratesft/socrates-qwen2.5-14b-sft
Text Generation
Concurrency Cost: 1
Model Size: 14.8B
Quantization: FP8
Context Length: 32k
Published: Aug 31, 2025
License: apache-2.0
Architecture: Transformer
Open Weights

socratesft/socrates-qwen2.5-14b-sft is a 14.8-billion-parameter language model developed by socratesft on the Qwen2.5 architecture. It was fine-tuned with supervised fine-tuning (SFT) on the SocSci210 dataset and is designed to simulate survey respondents, generating precise answers conditioned on detailed demographic profiles and specific instructions. This makes it suitable for social science research and data-generation tasks.
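As a minimal sketch of how the model might be queried for respondent simulation, the snippet below loads it with Hugging Face `transformers` and conditions generation on a profile. The prompt structure (profile in the system message, survey question in the user message) and the example profile are assumptions for illustration; consult the model card or the SocSci210 dataset for the exact expected format.

```python
# Sketch: simulating one survey respondent with transformers.
# The prompt format below is an assumption, not the documented one.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "socratesft/socrates-qwen2.5-14b-sft"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype="auto", device_map="auto"
)

# Hypothetical demographic profile and survey item.
messages = [
    {
        "role": "system",
        "content": (
            "You are a survey respondent with this profile: "
            "age 34, urban resident, college educated, works in retail."
        ),
    },
    {
        "role": "user",
        "content": (
            "On a scale of 1-5, how much do you trust local news? "
            "Answer with a single number."
        ),
    },
]

# Qwen2.5-based models ship a chat template, so we can apply it directly.
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=32, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```

Sampling with a moderate temperature (rather than greedy decoding) is one way to reflect response variability across simulated respondents; the right decoding settings depend on the study design.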
