2023
PROMPT
ENGINEERING
Made by:
João A. Ribeiro
The Basics.
Prompt engineering is a novel discipline that has
emerged alongside the popularization of Large Language
Models (LLMs) such as ChatGPT by OpenAI.
Prompt engineering skills can help you to better
understand the capabilities and limitations of LLMs, and
most importantly, to obtain desired outputs from
prompts.
Given these new concepts, it's worth clarifying that
Large Language Models (LLMs) are a subset of artificial
intelligence. Their ability to generate contextually
relevant text based on the input they receive places
LLMs within the category of Generative AI.
Researchers use prompt engineering to enhance LLM
performance on a wide range of tasks, while developers
design effective prompts for seamless interactions with
LLMs.
While we won't dive too deep here, this guide aims to
help you grasp the basics of prompt engineering.
Hopefully, after engaging with this content, you will
achieve better outcomes in your interactions with this
remarkable new technology.
Main Elements.
In a nutshell, prompts typically include these
components:
Instruction: This guides the AI for a specific task.
Context: Extra information that helps elicit better
responses.
Input Data: Material used to generate output (and
more).
Output Indicator: Specifies the format for desired
results.
It's worth noting that a prompt doesn't always require
all four parts.
As you read the next sections, you'll gain a clearer
understanding of their importance and how they're put
into action.
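The four components above can be pictured as pieces of a single prompt string. The sketch below is illustrative: the function and parameter names (build_prompt, instruction, context, input_data, output_indicator) are not a standard API, just one way to assemble the parts, with any missing component simply left out.

```python
def build_prompt(instruction, context=None, input_data=None,
                 output_indicator=None):
    """Assemble a prompt from up to four optional components."""
    # Only the instruction is required; the rest are appended if present.
    parts = [instruction]
    for part in (context, input_data, output_indicator):
        if part:
            parts.append(part)
    return "\n".join(parts)

# Example: a prompt with an instruction, input data, and an output
# indicator, but no extra context.
prompt = build_prompt(
    instruction="Classify the text into neutral, negative or positive.",
    input_data="Text: I think drinking water is necessary",
    output_indicator="Sentiment:",
)
print(prompt)
```

Notice that leaving the output indicator ("Sentiment:") as the final line nudges the model to complete the prompt with exactly the label we want.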
Zero-Shot Prompting.
Zero-Shot Prompting involves giving an AI model a
prompt for a task it hasn't been explicitly trained on.
Even without task-specific examples, the model draws on
its broad knowledge to tackle the new task.
Example:
Prompt:
Classify the text into neutral, negative or
positive.
Text: I think drinking water is necessary
Sentiment:
Output:
Neutral
Notice that the LLM was able to understand "sentiment"
without being provided with any previous examples --
that's zero-shot capability at work.
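As a minimal sketch, the zero-shot prompt above could be sent programmatically through the OpenAI Python library. This assumes the openai package is installed and an OPENAI_API_KEY environment variable is set; the model name is illustrative, and classify_sentiment is a hypothetical helper, not part of any library.

```python
# Template for the zero-shot sentiment prompt from the example above.
ZERO_SHOT_TEMPLATE = (
    "Classify the text into neutral, negative or positive.\n"
    "Text: {text}\n"
    "Sentiment:"
)

def classify_sentiment(text: str) -> str:
    # Imported lazily so the template is usable without the package.
    from openai import OpenAI  # assumption: openai package installed
    client = OpenAI()  # reads OPENAI_API_KEY from the environment
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",  # illustrative model choice
        messages=[
            {"role": "user",
             "content": ZERO_SHOT_TEMPLATE.format(text=text)},
        ],
    )
    return response.choices[0].message.content

# classify_sentiment("I think drinking water is necessary")
```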
Few-Shot Prompting.
The zero-shot capabilities are incredible, but they may
still fall short on more complex tasks. Few-shot
prompting essentially means showing examples right in
the prompt so the model can learn on the spot. It's like
giving the model a quick training session before it
starts working.
Example:
Prompt:
A "whatpu" is a small, furry animal native to
Tanzania. An example of a sentence that uses
the word “whatpu” is: We were traveling in
Africa and we saw these very cute whatpus.
A "flibberflop" is a dance move that involves
twirling and hopping simultaneously. An example
of a sentence that uses the word flibberflop
is:
Output:
The party was so lively that everyone started
doing the flibberflop.
We can observe that the model learned how to perform
the task after being shown just one example (1-shot).
For more difficult tasks, we can experiment with
increasing the demonstrations (3-shot, 5-shot, 10-shot,
etc.).
Let's try one more example:
Prompt:
This is awesome! // Positive
This is bad! // Negative
Wow that movie was rad! // Positive
What a horrible show! //
Output:
Negative
Take note that we've made a slight format adjustment
here. GPT-3 (the LLM in use) adeptly comprehended the
user's intent and input format, and provided accurate
output.
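The "text // label" format above is easy to generate from a list of labeled examples. The helper below is a small illustrative sketch (build_few_shot_prompt is not a library function): it formats each example on its own line and leaves the final query unlabeled for the model to complete.

```python
def build_few_shot_prompt(examples, query):
    """Format labeled examples plus an unlabeled query as one prompt."""
    # Each demonstration goes on its own line as "text // label".
    lines = [f"{text} // {label}" for text, label in examples]
    # The query ends with "//" so the model fills in the label.
    lines.append(f"{query} //")
    return "\n".join(lines)

examples = [
    ("This is awesome!", "Positive"),
    ("This is bad!", "Negative"),
    ("Wow that movie was rad!", "Positive"),
]
prompt = build_few_shot_prompt(examples, "What a horrible show!")
print(prompt)
```

Adding more demonstrations to the examples list is all it takes to move from 3-shot to 5-shot or 10-shot prompting.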
Sentiment analysis is just one example; the
possibilities here are virtually endless.
The End.
My intention with this guide is to equip you with basic,
but effective, prompting techniques, such as zero-shot
and few-shot prompting.
The techniques presented in this guide should be enough
to get you started on your prompting journey and help
you obtain better outputs and results.
I hope this guide was able to provide that for you.
If you have any questions or ideas about prompt
engineering, artificial intelligence, or how
DigitalBoost can help your business with AI and
automation, just shoot me a message!
joaoribeiro@digitalboost.ai