RomanNeobutov/PhysicsOnBooks
Text generation · Model size: 1.5B · Quant: BF16 · Context length: 32k · Published: May 10, 2025 · License: other · Architecture: Transformer

RomanNeobutov/PhysicsOnBooks is a 1.5-billion-parameter causal language model with a 131,072-token context length, trained with AutoTrain. Its main appeal is the combination of a compact parameter count with an exceptionally long context window, making it suited to tasks that require processing large amounts of text at once.
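A minimal loading sketch, assuming the model is hosted on the Hugging Face Hub and usable with the `transformers` library (the card does not specify a runtime, so this is an assumption, not documented usage):

```python
# Hypothetical usage sketch for RomanNeobutov/PhysicsOnBooks.
# Assumes the transformers and torch packages are installed and the
# checkpoint is a standard Hub causal-LM repo; neither is confirmed
# by the model card.

MODEL_ID = "RomanNeobutov/PhysicsOnBooks"

def load_model(model_id: str = MODEL_ID):
    """Return (tokenizer, model), loading weights in BF16 to match
    the card's stated quantization."""
    # Imports live inside the function so importing this sketch does
    # not require transformers/torch to be present.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id, torch_dtype=torch.bfloat16
    )
    return tokenizer, model
```

A long-context model like this is typically paired with a generous `max_length` at generation time; the exact supported window should be verified against the checkpoint's own config.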