Corianas/tiny-llama-miniguanaco-1.5T: Question Answering Model
This model, developed by Corianas, is a 1.1 billion parameter variant of TinyLlama, fine-tuned specifically for question-answering tasks. It is designed to take a prompt (question) and generate a direct answer, concluding the response with an `<END>` token on a new line.
Key Capabilities
- Direct Question Answering: Optimized to provide concise answers to user queries.
- Compact Size: With 1.1 billion parameters, it offers a relatively small footprint for deployment.
- Structured Output: Responses follow a clear question-then-answer structure, ending with `<END>`.
Usage and Formatting
The model expects a simple input format: `prompt\ncompletion\n<END>`. When generating text, the prompt should be followed by a newline to initiate the answer. The output will contain the answer, terminated by `<END>` on a new line.
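The formatting convention above can be sketched in plain Python. This is an illustrative sketch, not part of the model's API: the helper names (`build_prompt`, `extract_answer`) are hypothetical, and the example string stands in for text the model would actually generate.

```python
# Illustrative helpers for the prompt/completion format described above.
# Function names are hypothetical, not part of the model's API.

END_TOKEN = "<END>"

def build_prompt(question: str) -> str:
    # The model expects the question followed by a newline,
    # which signals it to begin the answer.
    return question + "\n"

def extract_answer(generated: str, prompt: str) -> str:
    # Drop the echoed prompt (if present), then truncate at the <END> marker.
    completion = generated[len(prompt):] if generated.startswith(prompt) else generated
    answer, _sep, _rest = completion.partition(END_TOKEN)
    return answer.strip()

# Hypothetical model output following the documented format.
raw = "What is the capital of France?\nParis\n<END>"
print(extract_answer(raw, build_prompt("What is the capital of France?")))  # Paris
```

In practice, the same truncation logic would be applied to the decoded output of whatever generation backend serves the model.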
Good For
- Applications requiring straightforward, factual answers to questions.
- Scenarios where a compact, efficient language model for question answering is beneficial.
- Integration into systems that can parse the `<END>` token for response termination.