LukeBailey181/goedel_prover_v2_8b_reviewer_finetuned_2048_num_samples
Text generation
Concurrency cost: 1
Model size: 8B
Quantization: FP8
Context length: 32k
Published: Mar 13, 2026
Architecture: Transformer
Status: Warm

LukeBailey181/goedel_prover_v2_8b_reviewer_finetuned_2048_num_samples is an 8-billion-parameter language model with a 32,768-token context length. It is a fine-tuned model, but its model card does not document the base architecture, the training procedure, or what distinguishes it from its base model. Its intended use cases and capabilities are likewise unspecified.
