Learn Prompting

Learn Prompting - https://learnprompting.org/
Basics
- Introduction:
- Q: What is artificial intelligence (AI)?
- A: A field in which people have created "smart" algorithms that "think" like humans.

- Q: What are some industries that AI is currently revolutionizing?

- A:
1. Manufacturing: General Electric (GE) uses AI to optimize production processes in its factories, resulting in a 20% reduction in unplanned downtime.
2. Finance: JPMorgan Chase uses AI to detect fraud in credit card transactions, resulting in a 20% reduction in false positives.
3. Transportation: Tesla uses AI to develop self-driving cars, which are designed to reduce accidents and improve traffic flow.
4. Retail: Amazon uses AI to personalize customer experiences on its e-commerce platform, recommending products based on past purchases and browsing history.
5. Marketing and advertising: Coca-Cola uses AI to analyse consumer data and create personalized marketing campaigns, resulting in a 32% increase in sales.
6. Energy: Duke Energy uses AI to optimize its power grid, reducing outages and improving energy efficiency.
7. Agriculture: John Deere uses AI to monitor crop health, improving yields and reducing water usage by up to 20%.

- Q: How can AI be used to improve efficiency and accuracy?


- A: By automating mundane tasks, making predictions and generating insights, and
identifying potential risks and opportunities.

- Q: What is one benefit of AI in finance?


- A: AI can detect patterns in the stock market and offer insights that would otherwise
go unnoticed.

- Q: Why is being able to properly direct AIs a powerful skill?


- A: Because AIs require humans to direct them on what to do and being able to do so
properly can lead to increased efficiency and accuracy.

- Q: What is the applied prompting section?


- A: A section that may be of interest if you would like to see how professionals use AI
to automate their work.

- Q: Do you need a technical background to do prompt engineering?


- A: No, most of it is trial and error, and you can learn as you go.

- Prompting:
- Q: What is prompting in AI?
- A: The process of instructing an AI to do a task by giving it a set of instructions (the prompt), which it then performs.

- Q: What is the focus of the field of Prompt Engineering?
- A: Creating prompts that yield optimal results on a task.
- Giving Instructions:
- Q: What is the name of the prompting method that involves giving instructions?
- A: Instruction prompting
- Q: What is an example of a more complex problem that modern AIs can solve?
- A: Cleaning a sales email by removing personally identifiable information (PII) and replacing it with appropriate placeholders.

- Q: What is the Turking Test?


- A: A test to determine if language models can understand instructions.

- Q: What is a promising approach for allowing AI to remove PII from text?


- A: Reframing instructional prompts to GPTk's language.
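
As a concrete illustration of instruction prompting, the sketch below builds a plain-language instruction for the PII-cleaning task described above. The email text and the placeholder names are invented for illustration; the assembled string would be sent to whichever LLM you use.

    # A minimal sketch of instruction prompting: the prompt is just a plain-language
    # instruction followed by the text to transform. The email and placeholders are
    # invented for illustration.
    email = (
        "Hi John,\n"
        "Thanks for the demo yesterday. You can reach me at jane.doe@example.com "
        "or 555-0142 if you have any questions.\n"
        "Best, Jane Doe"
    )

    prompt = (
        "Remove all personally identifiable information from the email below and "
        "replace it with the placeholders [NAME], [EMAIL], and [PHONE]. "
        "Return only the cleaned email.\n\n"
        f"Email:\n{email}"
    )
    print(prompt)
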
- Role Prompting:
- Q: What is role prompting?
- A: Assigning a role to the AI to give it context and help it understand the question
better.

- Q: How does role prompting help the AI give better answers?


- A: It gives the AI context and a better understanding of the question.
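
As a concrete illustration of role prompting, the sketch below simply prepends a role to a question before it is sent to the model. The role and question are invented examples.

    # A minimal sketch of role prompting: the role gives the model context for
    # how to frame its answer. Role and question are illustrative.
    role = "You are a patient math teacher who explains every step in plain language."
    question = "What is 15% of 80?"

    prompt = f"{role}\n\n{question}"
    print(prompt)
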
- Few-shot Prompting:
- Q: What is few-shot prompting?
- A: Few-shot prompting is a strategy where the model is shown a few examples of what is required and then asked to perform the task.

- Q: What is the key use case for few-shot prompting?
- A: Few-shot prompting is useful when you need the output to be structured in a specific way that is difficult to describe to the model.

- Q: What is the difference between 0-shot, 1-shot, and few-shot prompting?


- A: 0-shot prompting involves showing the model a prompt without examples, 1-shot
prompting involves showing the model a single example, and few-shot prompting
involves showing the model 2 or more examples.

- Q: Why is few-shot prompting preferred over 0-shot and 1-shot prompting in most cases?
- A: Usually, the more examples you show the model, the better the output will be, so
few-shot prompting is preferred over 0-shot and 1-shot prompting in most cases.
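
To make the 0-shot/few-shot distinction concrete, the sketch below builds both prompt variants for a small sentiment-classification task. The reviews and labels are invented exemplars; note how the few-shot version shows the model the exact output format we want.

    # A minimal sketch contrasting 0-shot and few-shot prompts. The exemplars
    # demonstrate the desired output format ("positive" / "negative").
    task = "Classify the sentiment of the review as positive or negative."
    new_review = "The battery died after two days."

    zero_shot = f"{task}\nReview: {new_review}\nSentiment:"

    exemplars = [
        ("I love this phone, the screen is gorgeous.", "positive"),
        ("Shipping took forever and the box was crushed.", "negative"),
    ]
    examples = "\n".join(f"Review: {r}\nSentiment: {s}" for r, s in exemplars)
    few_shot = f"{task}\n{examples}\nReview: {new_review}\nSentiment:"

    print(zero_shot)
    print("---")
    print(few_shot)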

- Formalising Prompts:
- Q: What are the different parts of a prompt that you will see over and over again?
- A: A role, an instruction/task, a question, and context.

- Q: What is a "standard" prompt according to Kojima et al.?


- A: Prompts that consist solely of a question.

- Q: What are exemplars in few shot standard prompts?


- A: Examples of the task that the prompt is trying to solve, which are included in the
prompt itself.
- Q: What is the advantage of having the instruction as the last part of a prompt?
- A: The LLM is less likely to simply write more context instead of following the
instruction.

- Q: Why is it important to define a "standard" prompt?


- A: To discuss new types of prompts in contrast to standard prompts.

- Q: What is the title of the article by Liu et al.?


- A: "Pre-train, Prompt, and Predict: A Systematic Survey of Prompting Methods in
Natural Language Processing"

- Q: What is the main finding of the article by Kojima et al.?


- A: Large language models are zero-shot reasoners.

- Q: What is the title of the article by Brown et al.?


- A: "Language Models are Few-Shot Learners".
- Chatbot basics:
- Q: What is the difference between GPT-3 and chatbots like ChatGPT?
- A: Chatbots like ChatGPT can remember your conversation history, while GPT-3 has
no memory.

- Q: What is style guidance in chatbots?


- A: Style guidance is asking the AI to speak in a certain style, such as a friendly or
informal tone.

- Q: How can adding descriptors to a prompt change the chatbot's response?


- A: Adding descriptors such as "Funny", "Curt", "Unfriendly", "Academic Syntax",
etc. can change how the chatbot interprets or responds to the message.

- Q: How can the form of the first prompt affect the remainder of the conversation in a
chatbot?
- A: The form of the first prompt can affect the remainder of the conversation, allowing
for an additional level of structure and specification.

- Q: What is the purpose of using style guidance, descriptors, and priming in chatbots?
- A: Using style guidance, descriptors, and priming can help to better utilize chatbots
and increase the quality of responses.

- Q: What does "critique" mean in the context of analysing a text?


- A: "Critique" means to analyse the given text and provide feedback.

- Q: What does "summarise" mean in the context of a text?


- A: "Summarise" means to provide key details from a text.

- Q: What does "respond" mean in the context of a text?


- A: "Respond" means to answer a question from the given perspective.
- Pitfalls of LLMs:
- Q: What is one pitfall of LLMs when it comes to citing sources?
- A: LLMs cannot accurately cite sources because they do not have access to the
internet and do not remember where their information came from.
- Q: What is one potential issue with using LLMs in consumer-facing applications or
research?
- A: LLMs are often biased towards generating stereotypical responses, which can
lead to sexist, racist, or homophobic content.

- Q: What is one area where LLMs often struggle?


- A: LLMs are often bad at math and have difficulty solving simple and complex math
problems.
- LLM Settings:
- Q: What is temperature in the context of language models?
- A: Temperature is a configuration hyperparameter that controls the randomness of
language model output.

- Q: How does a high temperature affect the output of a language model?


- A: A high temperature produces more unpredictable and creative results.

- Q: What is top p in the context of language models?


- A: Top p, also known as nucleus sampling, is another configuration hyperparameter
that controls the randomness of language model output.

- Q: How does a low top p value affect the output of a language model?
- A: A low top p value can produce more conservative and predictable results.

- Q: When should a low temperature or top p value be used in language models?


- A: For tasks where accuracy is important, such as translation tasks or question
answering, a low temperature or top p value should be used to improve accuracy and
factual correctness.
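
As a sketch of how these settings are used in practice, the hypothetical helper below picks low temperature and top p values for accuracy-critical tasks and higher values for creative ones. Only the parameter names mirror common completion APIs; build_request itself is invented for illustration.

    # A minimal sketch of choosing temperature / top_p per task. The helper
    # build_request is hypothetical; only the parameter names mirror common APIs.
    def build_request(prompt: str, accurate: bool) -> dict:
        # Low values -> conservative, factual output (translation, QA).
        # High values -> more unpredictable, creative output (stories, brainstorming).
        settings = (
            {"temperature": 0.2, "top_p": 0.1}
            if accurate
            else {"temperature": 0.9, "top_p": 1.0}
        )
        return {"prompt": prompt, **settings}

    print(build_request("Translate 'good morning' to French.", accurate=True))
    print(build_request("Write a short poem about autumn.", accurate=False))
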
- Understanding AI minds:
- Q: What are discriminative AIs?
- A: AIs that classify things.

- Q: What are generative AIs?


- A: AIs that make things.

- Q: Which AIs are mainly used in this course?


- A: GPT-3 and ChatGPT.

- Q: How do these AIs understand sentences?


- A: By breaking them into words/sub-words called tokens. Each token they write is
based on the previous tokens they have seen and written; every time they write a
new token, they pause to think about what the next token should be.

- Q: Are the words "think", "brain", and "neuron" accurate descriptions of what these AIs are
doing?
- A: No, they are anthropomorphisms, or metaphors for what the model is actually
doing. These models are not really thinking; they are just math functions. They are
not actually brains; they are just artificial neural networks. Their "neurons" are not
biological neurons; they are just numbers.
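
The token splitting described above can be inspected directly with the tiktoken library (pip install tiktoken); using it here, and the "cl100k_base" encoding name, is an assumption of this sketch rather than something the notes specify.

    # A minimal sketch of how text becomes tokens, assuming tiktoken is installed.
    import tiktoken

    enc = tiktoken.get_encoding("cl100k_base")
    text = "Prompt engineering is mostly trial and error."
    token_ids = enc.encode(text)

    print(token_ids)                             # integer ids the model actually sees
    print([enc.decode([t]) for t in token_ids])  # the text piece behind each id
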
- Starting your journey:
- Q: What is the first step in solving an arbitrary prompt engineering problem?
- A: Researching the prompt you want.

- Q: What are some resources that can be helpful when researching a prompt?
- A: Awesome ChatGPT Prompts, FlowGPT, r/PromptDesign, Learn Prompting
Discord, and other relevant Discords.

- Q: What is the purpose of experimenting with a prompt in ChatGPT?


- A: To see how the prompt generates a story and make modifications to improve it.

- Q: What modification was made to the original prompt to output the story as plain text?
- A: The prompt was modified to remove the csharp code block.

- Q: What additional structure was added to the prompt to improve its detail?
- A: The prompt was modified to include flowery language and descriptive words, as
well as a specific setting and format for responses.

- Q: What is the best way to learn how to write good prompts?


- A: Trial and error

- Q: Are there any strict rules for writing the best prompts?
- A: No, there is no gold standard for how to write the best prompts.
