FelixChao/Capricorn-7B

Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Context Length: 4k · Published: Feb 14, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights · Cold

FelixChao/Capricorn-7B is a 7 billion parameter language model. The model card indicates that further information is needed regarding its specific architecture, training, and intended use cases. As such, its primary differentiators and optimal applications are currently undefined.


Model Overview

FelixChao/Capricorn-7B is a 7 billion parameter language model. Its model card is a largely unfilled base template, so details on its development, funding, model type, and language support are pending.

Key Characteristics

  • Parameter Count: 7 billion parameters.
  • Context Length: 4096 tokens.
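
The 4096-token context window means the prompt and any generated tokens must share that budget. The sketch below illustrates that budgeting arithmetic; the helper function and token counts are hypothetical examples, not part of the model card:

```python
# Illustrative prompt-budget helper for a fixed 4096-token context window.
# Token IDs here are placeholders; a real tokenizer would produce them.
CONTEXT_LENGTH = 4096

def fit_prompt(prompt_tokens: list[int], max_new_tokens: int,
               context_length: int = CONTEXT_LENGTH) -> list[int]:
    """Truncate the prompt (keeping the most recent tokens) so that the
    prompt plus the requested generation fits in the context window."""
    budget = context_length - max_new_tokens
    if budget <= 0:
        raise ValueError("max_new_tokens exceeds the context window")
    return prompt_tokens[-budget:]

# Example: a 5000-token prompt with 512 tokens reserved for generation
tokens = list(range(5000))
fitted = fit_prompt(tokens, max_new_tokens=512)
print(len(fitted))  # 3584 tokens kept (4096 - 512)
```

Keeping the tail of the prompt is a common choice for chat-style inputs, where the most recent turns matter most; other applications may prefer to truncate from the middle or summarize instead.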

Current Status

Per the model card, the following sections are marked "More Information Needed":

  • Model Description (developer, funding, type, language, license, finetuned from)
  • Model Sources (repository, paper, demo)
  • Intended Uses (direct use, downstream use, out-of-scope use)
  • Bias, Risks, and Limitations
  • Training Details (data, procedure, hyperparameters, speeds, sizes, times)
  • Evaluation (testing data, factors, metrics, results)
  • Technical Specifications (architecture, objective, compute infrastructure)

Recommendations

Users should note that the model's capabilities, limitations, and appropriate use cases cannot yet be assessed from the card. Await updates to the model card before relying on it for guidance on application or potential risks.