AstroMLab/AstroSage-8B
Text generation · Concurrency cost: 1 · Model size: 8B · Quant: FP8 · Context length: 32k · Published: Nov 14, 2024 · Architecture: Transformer
AstroMLab/AstroSage-8B is an 8 billion parameter, domain-specialized natural language AI assistant based on Meta-Llama-3.1-8B, developed by AstroMLab. It is specifically tailored for research in astronomy, astrophysics, and cosmology, having been trained on a comprehensive collection of astronomy-related arXiv papers and millions of synthetic question-answer pairs. This model demonstrates proficiency in a wide range of astronomical questions, outperforming general-purpose models of similar size and achieving comparable performance to GPT-4o on specialized benchmarks.
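As an illustration of how such a hosted model is typically queried, the sketch below assembles an OpenAI-style chat-completion request body for AstroSage-8B. The system prompt, parameter values, and the `build_chat_payload` helper are illustrative assumptions, not part of the model card or a documented Featherless API call:

```python
import json

def build_chat_payload(question, temperature=0.7, top_p=0.9, max_tokens=512):
    """Assemble a hypothetical chat-completion request body for AstroSage-8B.

    The system prompt and default sampler values here are placeholders;
    adjust them for your own deployment.
    """
    return {
        "model": "AstroMLab/AstroSage-8B",
        "messages": [
            {"role": "system",
             "content": "You are AstroSage, an astronomy research assistant."},
            {"role": "user", "content": question},
        ],
        "temperature": temperature,
        "top_p": top_p,
        "max_tokens": max_tokens,
    }

payload = build_chat_payload("What sets the Chandrasekhar mass limit?")
print(json.dumps(payload, indent=2))
```

The payload would then be POSTed to whatever OpenAI-compatible endpoint serves the model.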
Popular Sampler Settings
Featherless users' top parameter combinations for this model tune the following samplers: temperature, top_p, top_k, frequency_penalty, presence_penalty, repetition_penalty, and min_p.
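To make the listed parameters concrete, here is a minimal plain-Python sketch of temperature scaling combined with nucleus (top_p) filtering, two of the most commonly tuned samplers above. This is an illustrative reimplementation, not Featherless's actual sampling code:

```python
import math
import random

def sample_top_p(logits, temperature=0.8, top_p=0.9, rng=None):
    """Sample a token index from logits using temperature + top_p (nucleus) filtering."""
    # Temperature scaling: lower values sharpen the distribution.
    scaled = [l / temperature for l in logits]
    # Numerically stable softmax.
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    # Keep the smallest set of highest-probability tokens whose
    # cumulative probability reaches top_p.
    order = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)
    kept, cum = [], 0.0
    for i in order:
        kept.append(i)
        cum += probs[i]
        if cum >= top_p:
            break
    # Renormalize over the kept set and sample from it.
    mass = sum(probs[i] for i in kept)
    rng = rng or random.Random(0)
    r = rng.random() * mass
    for i in kept:
        r -= probs[i]
        if r <= 0:
            return i
    return kept[-1]

# Example: with one strongly dominant logit and top_p=0.5, only that
# token survives the nucleus cut.
print(sample_top_p([10.0, 0.0, 0.0], temperature=1.0, top_p=0.5))
```

The other parameters (top_k, frequency_penalty, presence_penalty, repetition_penalty, min_p) apply analogous filters or penalties to the same probability distribution before sampling.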