google/txgemma-27b-predict
Text generation · Concurrency cost: 2 · Model size: 27B · Quantization: FP8 · Context length: 32k · Published: Mar 21, 2025 · License: health-ai-developer-foundations · Architecture: Transformer · Access: gated

TxGemma-27B-Predict is a 27 billion parameter open language model developed by Google, built upon the Gemma 2 architecture and fine-tuned for therapeutic development. It excels at processing and understanding information related to therapeutic modalities and targets, such as small molecules, proteins, and diseases. This model is optimized for property prediction tasks in drug discovery and can serve as a foundation for further specialized fine-tuning.
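As a rough illustration of the property-prediction use case described above, the sketch below queries the model with Hugging Face transformers. This is an assumption-laden example: the prompt wording is illustrative (TxGemma ships its own prompt templates, which are not reproduced on this page), and running `predict` requires transformers, torch, and a large GPU.

```python
# Hypothetical sketch of a property-prediction query via Hugging Face
# transformers. The prompt format below is illustrative only.
MODEL_ID = "google/txgemma-27b-predict"

def build_prompt(question: str) -> str:
    """Wrap a therapeutic property question in a simple instruction
    prompt (illustrative format, not the official TxGemma template)."""
    return (
        "Instructions: Answer the following question about drug properties.\n"
        f"Question: {question}\n"
        "Answer:"
    )

def predict(question: str, max_new_tokens: int = 8) -> str:
    """Generate an answer. Needs transformers, torch, and a large GPU;
    call this manually on suitable hardware."""
    # Deferred import so the prompt helper above works without the
    # heavy dependencies installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")
    inputs = tokenizer(build_prompt(question), return_tensors="pt").to(model.device)
    out = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the prompt.
    return tokenizer.decode(
        out[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True
    )
```

Because the model is gated, the Hugging Face account used must first accept the health-ai-developer-foundations license terms.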


Popular Sampler Settings

The three parameter combinations most used by Featherless users for this model. Each configuration sets the following sampler parameters:

temperature — scales the token logits; higher values produce more random output
top_p — nucleus sampling: sample from the smallest token set whose cumulative probability exceeds p
top_k — restrict sampling to the k most probable tokens
frequency_penalty — penalizes tokens in proportion to how often they have already appeared
presence_penalty — penalizes tokens that have appeared at all, regardless of count
repetition_penalty — multiplicatively down-weights previously generated tokens
min_p — discards tokens whose probability falls below min_p times the top token's probability
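The parameters above map onto the request body of an OpenAI-compatible completions call. The sketch below shows one way to bundle them; the endpoint URL, the environment-variable name, and all parameter values are assumptions for illustration, since this page does not show the actual popular configurations.

```python
import json
import os
import urllib.request

# Assumed OpenAI-compatible endpoint; verify against the provider's docs.
API_URL = "https://api.featherless.ai/v1/completions"

def build_payload(prompt: str) -> dict:
    """Bundle the listed sampler parameters with illustrative values
    (not the actual popular configs, which this page does not show)."""
    return {
        "model": "google/txgemma-27b-predict",
        "prompt": prompt,
        "temperature": 0.7,
        "top_p": 0.9,
        "top_k": 40,
        "frequency_penalty": 0.0,
        "presence_penalty": 0.0,
        "repetition_penalty": 1.1,
        "min_p": 0.05,
    }

def complete(prompt: str) -> str:
    """Send the request. Requires a FEATHERLESS_API_KEY environment
    variable (hypothetical name); call manually with valid credentials."""
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(build_payload(prompt)).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {os.environ['FEATHERLESS_API_KEY']}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["text"]
```

For a deterministic property-prediction task like this model's, a lower temperature (or greedy decoding) is usually preferable to the chat-oriented defaults shown here.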