AngelRaychev/0.5B-value-iteration_1
Text Generation · Concurrency Cost: 1 · Model Size: 0.5B · Quant: BF16 · Ctx Length: 32k · License: apache-2.0 · Architecture: Transformer · Open Weights · Cold

AngelRaychev/0.5B-value-iteration_1 is a 0.5-billion-parameter language model fine-tuned from AngelRaychev/0.5B-value-iteration_0. It was trained for 50 epochs with a constant learning rate of 1e-06 and reached a validation loss of 0.3933. The available information does not describe its intended application or primary differentiator.
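Assuming the checkpoint is a standard Hugging Face causal LM (the card does not confirm the loading interface), a minimal sketch of using it with the `transformers` library might look like the following. The `generate_text` helper and its parameters are illustrative, not part of the card; only the model ID and the training summary values come from the card itself.

```python
MODEL_ID = "AngelRaychev/0.5B-value-iteration_1"

# Hyperparameters and results as reported on the card.
TRAINING_SUMMARY = {
    "base_model": "AngelRaychev/0.5B-value-iteration_0",
    "epochs": 50,
    "learning_rate": 1e-06,
    "validation_loss": 0.3933,
}

def generate_text(prompt: str, max_new_tokens: int = 128) -> str:
    """Illustrative helper: download the weights on first call and generate.

    Requires `transformers` and `torch`; imports are kept inside the
    function so the module loads even without them installed.
    """
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    # BF16 matches the quantization listed on the card.
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype="bfloat16")
    inputs = tokenizer(prompt, return_tensors="pt")
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output[0], skip_special_tokens=True)
```

Note that a 0.5B model in BF16 needs roughly 1 GB of memory for the weights alone, so it can run on modest hardware.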
