Prompt a language model
The prompt for a large language model such as LLaMA (Large Language Model Meta AI) refers to the initial text or set of instructions a user provides to interact with the model and obtain a desired response. The prompt serves as the input or query that the model processes to generate a meaningful and contextually relevant output.
Here's a breakdown of what a prompt might consist of:
- Text Input: The prompt usually begins with a text input or statement that outlines the user's request or question. This can be a sentence, a paragraph, or a series of sentences.
- Instructions: In addition to the initial text input, prompts often contain explicit instructions that guide how the model should respond. These instructions can specify the format of the response, request certain information, or direct the model to follow a specific style or tone.
- Context: The prompt may include context the model should consider when generating a response, such as a specific topic, scenario, or conversation history. Providing context helps ensure that the model's response is relevant and coherent.
- Examples: Some prompts include examples or sample text to illustrate the desired output. These examples show the model the expected style, content, or structure of the response.
- Special Tokens: Some prompts include special tokens or markers to delimit sections, for example indicating where the user's input ends and where the model's response should begin. The exact markers depend on how a particular model was trained.
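In code, these components are typically concatenated into a single prompt string. A minimal sketch in Python (the `build_prompt` function and the section labels are illustrative assumptions, not a fixed API; real models expect the specific markers they were trained with):

```python
def build_prompt(instructions, context, user_input, examples=None):
    """Assemble prompt components into a single string.

    The plain-text section labels below are illustrative only;
    substitute the special tokens your model was trained with.
    """
    parts = []
    if instructions:
        parts.append("Instructions: " + instructions)
    if context:
        parts.append("Context: " + context)
    if examples:
        parts.append("Examples:\n" + "\n".join(examples))
    parts.append("Input: " + user_input)
    return "\n\n".join(parts)

prompt = build_prompt(
    instructions="Continue the story with vivid descriptions.",
    context="A fairy tale set in a magical forest.",
    user_input="Once upon a time in a magical forest,",
)
```

Omitted components (here, the examples) are simply left out of the assembled string.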
For instance, if a user wants to use LLAMA to generate a creative short story, their prompt might look like this:
Input: "Once upon a time in a magical forest,"
Instructions: "Please continue the story with vivid descriptions and a touch of whimsy. Make sure to introduce a memorable character and a unique plot twist."
The language model then processes this prompt, takes into account the user's input and instructions, and generates a story continuation based on the provided context and guidelines.
Prompt template
The software uses a prompt template (see system settings) that lets you specify how the user prompt and the system prompt are passed to the language model. For example, Llama 2 based models work well with the following sections:
[SYS]${prompt.system}[/SYS]
[INST] ${prompt.user} [/INST]
The variable ${prompt.user} is replaced by the user input, and ${prompt.system} by the optional system prompt.
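The substitution itself is plain string replacement. A sketch in Python (the `apply_template` function is a hypothetical illustration of what such a template mechanism does, not the software's actual code):

```python
TEMPLATE = "[SYS]${prompt.system}[/SYS]\n[INST] ${prompt.user} [/INST]"

def apply_template(template, user, system=""):
    # Replace the placeholders with the actual prompts. str.replace is
    # used because the variable names contain a dot, which Python's
    # string.Template placeholder syntax does not accept.
    return (template
            .replace("${prompt.system}", system)
            .replace("${prompt.user}", user))

filled = apply_template(TEMPLATE,
                        user="Tell me a joke.",
                        system="You are a helpful assistant.")
```

If no system prompt is given, the [SYS] section is simply left empty.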
The effectiveness of the prompt is crucial in obtaining the desired output from the language model, as it provides the necessary information and constraints for the model to generate relevant and coherent responses.
Answering E-Mails
Here is a prompt that shows how to instruct the model to answer an E-Mail automatically.
Please provide a concise response to the email inquiring about [Topic Section].
Summarize the key points, address any questions or concerns, and offer any necessary follow-up actions.
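Filled in for a concrete case, such a template can be expanded programmatically before it is sent to the model. A sketch in Python (the topic and E-Mail text are made-up sample data, and `email_prompt` is a hypothetical helper):

```python
EMAIL_PROMPT = (
    "Please provide a concise response to the email inquiring about {topic}.\n"
    "Summarize the key points, address any questions or concerns, "
    "and offer any necessary follow-up actions.\n\n"
    "E-Mail:\n{email}"
)

def email_prompt(topic, email):
    # Fill the placeholders of the answering template; topic and
    # email below are invented sample data for illustration.
    return EMAIL_PROMPT.format(topic=topic, email=email)

prompt = email_prompt(
    topic="a delayed delivery",
    email="Hello, could you tell me when my order will arrive?",
)
```

The resulting string, containing both the instructions and the original message, is then passed to the model as a single prompt.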