futurehouse/ether0
Text Generation · Concurrency Cost: 2 · Model Size: 24B · Quant: FP8 · Context Length: 32K · Published: Jun 4, 2025 · License: Apache-2.0 · Architecture: Transformer · Open Weights

futurehouse/ether0 is a 24-billion-parameter language model developed by FutureHouse, fine-tuned from Mistral-Small-24B-Instruct-2501. It is trained to reason in English and to output molecular structures as SMILES strings, performing well on tasks such as IUPAC-name-to-SMILES conversion, molecular property modification, and retrosynthesis. The model targets chemistry-specific reasoning and molecular generation rather than general-purpose chat, and supports a 32K-token context length for precise chemical problem-solving.
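As a usage illustration, the sketch below queries the model for an IUPAC-name-to-SMILES conversion via Hugging Face transformers. It assumes the open weights are published under the futurehouse/ether0 repository and follow a standard chat template; the prompt wording and generation settings are illustrative, not prescribed by FutureHouse.

```python
# Minimal sketch: asking ether0 for an IUPAC-name-to-SMILES conversion.
# Assumes the weights live at "futurehouse/ether0" on Hugging Face and
# expose a standard chat template (inherited from Mistral-Small-24B).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "futurehouse/ether0"  # assumed repository id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

messages = [
    {"role": "user",
     "content": "Convert this IUPAC name to SMILES: 2-acetoxybenzoic acid"},
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

# The model reasons in English before emitting the final SMILES string,
# so leave room for the reasoning tokens in max_new_tokens.
outputs = model.generate(inputs, max_new_tokens=512)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```

The response is expected to contain English reasoning followed by the answer as a SMILES string; downstream code should parse the SMILES out of the completion rather than assume the whole output is machine-readable.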
