Prompt engineering best practices for ChatGPT

Learn how to craft effective prompts to get the best out of ChatGPT


This guide provides essential techniques to help ChatGPT Enterprise users craft effective prompts for high-quality responses. You can get the most out of ChatGPT by structuring your prompts, refining answers iteratively, and providing relevant context.

What is a prompt?

A prompt for a Large Language Model (LLM) is a text input that initiates a conversation or triggers a response from the model. It can also take other forms, such as an image or audio.

What is prompt engineering?

Prompt engineering is the process of designing and optimizing input prompts to effectively guide a language model's responses.

How to craft effective prompts

Give ChatGPT a persona

Assigning a persona allows ChatGPT to answer from a particular role or perspective. This can help produce responses tailored to specific audiences or scenarios.

Example prompt

You are a data analyst for our marketing team. Provide a summary of last quarter's campaign performance, emphasizing metrics relevant to future campaign planning.
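
If you interact with the model programmatically rather than through the ChatGPT interface, the same persona technique applies by placing the role description in a system message. The following is a minimal sketch using the OpenAI Python SDK; the model name and prompt text are illustrative assumptions, not part of this guide's original example.

from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# The system message assigns the persona; the user message carries the request itself.
response = client.chat.completions.create(
    model="gpt-4o",  # illustrative model choice
    messages=[
        {"role": "system", "content": "You are a data analyst for our marketing team."},
        {
            "role": "user",
            "content": "Provide a summary of last quarter's campaign performance, "
                       "emphasizing metrics relevant to future campaign planning.",
        },
    ],
)
print(response.choices[0].message.content)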

Add delimiters

Delimiters help distinguish specific segments of text within a larger prompt. For example, they make it explicit to ChatGPT which text needs to be translated, paraphrased, summarized, and so on.

Example prompt

Translate the text delimited by triple quotes to French:

"""

Yes we will schedule the meeting next Friday and review your updates to the project plan. Please invite your contacts from the product team and be ready to share next steps.

"""
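
The same idea works when building a prompt in code: wrap the text to be processed in a fixed delimiter so the instruction and the content stay clearly separated. Below is a minimal sketch assuming the OpenAI Python SDK; the model name is an illustrative choice.

from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

text_to_translate = (
    "Yes we will schedule the meeting next Friday and review your updates "
    "to the project plan. Please invite your contacts from the product team "
    "and be ready to share next steps."
)

# Triple quotes mark exactly which text should be translated.
prompt = f'Translate the text delimited by triple quotes to French:\n"""\n{text_to_translate}\n"""'

response = client.chat.completions.create(
    model="gpt-4o",  # illustrative model choice
    messages=[{"role": "user", "content": prompt}],
)
print(response.choices[0].message.content)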

Provide step-by-step instructions

Providing step-by-step instructions, also known as chain-of-logic or chain-of-thought prompting, is a technique used with large language models to improve their ability to solve complex problems, especially those requiring multiple reasoning steps. The approach structures the prompt to encourage the model to generate intermediate steps or reasoning that lead to the final answer, rather than jumping directly to the answer.

Example prompt

You are given text delimited by triple quotes.

Step 1 - Read the text

Step 2 - Provide feedback on grammar and structure

Step 3 - Rewrite the text with recommended edits

Step 4 - Translate the text to French and to Spanish
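
For programmatic use, the numbered steps can be embedded directly in the prompt so the model works through them in order. Here is a minimal sketch, again assuming the OpenAI Python SDK; the model name and the sample draft text are placeholders added for illustration.

from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

draft = "Their going to review the plan on friday, and decide next steps after."  # placeholder text

# Each numbered step nudges the model to produce intermediate work before the final output.
prompt = (
    "You are given text delimited by triple quotes.\n"
    "Step 1 - Read the text\n"
    "Step 2 - Provide feedback on grammar and structure\n"
    "Step 3 - Rewrite the text with recommended edits\n"
    "Step 4 - Translate the text to French and to Spanish\n"
    f'"""\n{draft}\n"""'
)

response = client.chat.completions.create(
    model="gpt-4o",  # illustrative model choice
    messages=[{"role": "user", "content": prompt}],
)
print(response.choices[0].message.content)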

Provide examples

This technique is called one-shot or few-shot prompting, depending on the number of examples. Providing examples gives the model context and illustrates the task at hand, letting it infer the pattern, style, or type of response expected.

Example prompt

Summarize the topic and mood of a piece of text

"""

A molecule, imagine this, is an astonishingly minuscule building block - so diminutive, it’s invisible! Yet, it’s the cornerstone of existence! These specks congregate to conjure, well, everything! Water, air, our very beings!

"""

Topic: Molecules

Mood: Amazement

"""

The new OpenAI Custom GPT sharing features released in March are awesome. I look forward to using these to customize who I share my GPT with!

"""
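
When calling the model through the API, a common way to supply few-shot examples is as prior user and assistant message pairs, so the model sees a worked example before the new input. The following is a minimal sketch with the OpenAI Python SDK; the model name and message wiring are illustrative assumptions.

from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

instruction = "Summarize the topic and mood of a piece of text."

# One worked example (an input plus the desired answer) shows the expected output format.
example_input = (
    "A molecule, imagine this, is an astonishingly minuscule building block - "
    "so diminutive, it's invisible! Yet, it's the cornerstone of existence!"
)
example_output = "Topic: Molecules\nMood: Amazement"

new_input = (
    "The new OpenAI Custom GPT sharing features released in March are awesome. "
    "I look forward to using these to customize who I share my GPT with!"
)

response = client.chat.completions.create(
    model="gpt-4o",  # illustrative model choice
    messages=[
        {"role": "system", "content": instruction},
        {"role": "user", "content": example_input},
        {"role": "assistant", "content": example_output},  # the example answer the model should imitate
        {"role": "user", "content": new_input},
    ],
)
print(response.choices[0].message.content)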

General best practices

Be clear and specific

Ensure your prompts are clear, specific, and provide enough context for the model to understand what you are asking. Avoid ambiguity and be as precise as possible to get accurate and relevant responses.

Iterative refinement

Prompt engineering often requires an iterative approach. Start with an initial prompt, review the response, and refine the prompt based on the output. Adjust the wording, add more context, or simplify the request as needed to improve the results.

Requesting a different tone

Use descriptive adjectives to indicate the tone. Words like formal, informal, friendly, professional, humorous, or serious can help guide the model. For instance, "Explain this in a friendly and engaging tone."
