Prompt Engineering: AI prompts that punch!

What is an AI prompt?

Generative AI tools like ChatGPT, Google Bard and Microsoft Copilot are becoming increasingly popular among developers, content creators and now almost everyone, with the release of Copilot in Edge (formerly Bing Chat Enterprise) and, of course, Copilot for Microsoft 365.

I often hear “ChatGPT is better than XYZ” or “Copilot is better than ABC”. The fact is, whilst these tools can yield incredible results, getting started can be challenging, and getting a prompt to do exactly what you “had in mind” takes practice – especially for those who are new to generative AI.

In this blog post, I provide some tips on how to work with generative AI tools like ChatGPT and Copilot, including how to write, and perfect, good AI prompts. Prompts are essentially instructions that are used to tell/ask the AI what you’d like it to do…

Understanding how Generative AI works

Generative AI chatbots use large language models (LLMs), machine learning algorithms, internet data and organisational data (in the case of Microsoft Copilot) to generate text; create, summarise, rewrite or transform content; write code; generate images; and even help people build low-code workflows or model-driven apps in Power Platform. These GenAI tools do this based on user input and context, known as “prompts”.

Whilst these tools are incredibly smart (having been trained on a decade of data, images, writing styles and even the works of Shakespeare), the results are not perfect and can sometimes generate inaccurate or irrelevant content, known as hallucinations.

These hallucinations are usually caused by the AI misunderstanding the user’s ask, conflicting requests, or poor data upon which it bases its response. Remember, these tools can access your company data (under the context of the user) as well as the web.

Writing good AI prompts – the ingredients

To get the best results from generative AI tools like ChatGPT and Copilot, it’s essential to write good AI prompts. Here are some AI 101 tips on what good AI prompts look like:

  • Be specific with the ask: Make sure you are being clear about what you want the AI model to do. The more specific you are, the better the AI model can understand your prompt and provide accurate results.
  • Avoid using ambiguous language: If what you ask could be interpreted in different ways, you may not get the result you hoped for. Be clear and concise in your prompt to avoid any confusion.
  • Provide context: This is crucial to ensure that the AI model understands the intent behind your prompt. The more context you provide, the better the AI model can understand what you are asking for.
  • Use simple language: Use language that is easy for the AI model to understand, and avoid complex words or phrases it may not be familiar with. Slang words are generally OK, but take your time to read the prompt back to make sure it makes sense.
  • Take advantage of turns: A turn is essentially your response to the AI’s answer. You can use turns either to rephrase your ask or to fine-tune the response, and they are a good alternative to trying to write one long, complex prompt in one go (see the short sketch after this list).
  • Make it a conversation: Building on the above, think of how you might ask a human to help you with a task. You can use the “turns” to perfect the prompt and even ask the AI why it gave a particular answer or to explain something you don’t understand. This may feel unnatural at first, but soon it feels like you’re just IM’ing a friend or co-worker.
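
If you’re curious what a “turn” actually is under the hood, here is a minimal sketch using the OpenAI Python client (my own illustration; the ChatGPT and Copilot apps manage this history for you, so treat the model name and code as an assumption rather than how those products work internally). Each turn is simply another message appended to the conversation history, and the full history is sent back to the model so it keeps the context of the chat so far.

```python
# A minimal sketch of "turns": each new prompt is appended to the conversation
# history and the whole history is sent back to the model. Illustrative only;
# assumes the OpenAI Python client and an OPENAI_API_KEY environment variable.
from openai import OpenAI

client = OpenAI()

# Turn 1: the initial prompt, with context and a specific ask.
messages = [
    {
        "role": "user",
        "content": "Create a product update for the Flux Capacitor V2. "
                   "It is a fictional product from Back to the Future. Be creative.",
    }
]
reply = client.chat.completions.create(model="gpt-4o-mini", messages=messages)
messages.append({"role": "assistant", "content": reply.choices[0].message.content})

# Turn 2: refine the answer instead of rewriting one long, complex prompt.
messages.append({"role": "user", "content": "Make it shorter and add a catchy title."})
reply = client.chat.completions.create(model="gpt-4o-mini", messages=messages)
print(reply.choices[0].message.content)
```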

Good and bad prompt examples

Here are some examples of good and bad AI prompts. Try these and see how you get along.

I’ve included a video which walks through these and shows the differences in the results based on the prompts we gave. You’ll see we can be quite specific in what we want. The video also showcases how we can “perfect” our answers through additional turns.

Goal: Create a product update for the Flux Capacitor 2

Bad prompt: “Create me a product update for the Flux Capacitor version 2”.

Good prompt: “Create me a product update for the Flux Capacitor version 2. This is a fictional product, based on the original flux capacitor used in the film Back to the Future 2. Make up some new improvements that the Flux Capacitor V2 could have over the first version. Be creative with improvements.”

Write a product brief in Copilot.

Explanation: the second prompt is better because we have provided context for the ask (this is a fictional product) and been specific with the ask (we have told it what we expect).

Goal: Create a Lego avatar of yourself

For this, I am using the Image Creator in Microsoft Designer.

Bad prompt: A picture of a person with brown hair as a Lego man.

Good prompt: A person with short dark brown hair, wearing a tuxedo and holding a glass of champagne, sitting on a chair outside a large country house at dusk on a cold summer evening. Lego style, illustration, 3D rendered.

Explanation: the second prompt is better (well, depending on what we want) because again we have been specific with the ask, provided some context about what we want to produce and described the image we want in detail.

Goal: Write a story about a dog called Benji

Bad prompt: Write a story about a dog named Benji

Good prompt: Write a story about a dog named Benji. Benji is a small puppy and lives with a family of four, including two young children called Jack and Jill. Benji is a lazy dog but discovers a passion for going for walks to train stations and barking at trains. Creative writing style.

Explanation: the second prompt is better because we have provided context for the story we would like and also given it some background. We have guided the AI on how we’d like the story to flow and then left the writing to the AI. We have also specified the mode we want it in (“be creative”). We can use another “turn” to make the story shorter or to write a catchy title for the story.

Goal: Extract key information from a document

Note: For this example, I am using a document here: Energy Consumption in the UK 2023 (publishing.service.gov.uk). I have opened this page in Microsoft Edge and am using Copilot in Edge to ask about the document.

Bad Prompt: Tell me about this document.

Good Prompt: Read this document and create a table that shows the main energy usage across different key areas in order of highest to lowest. Also provide a short commentary after the table that describes more about these areas and whether these are increasing over time or reducing.

Using Copilot in Edge to discuss and extract data from a document.

Explanation: the first prompt simply creates a summary of the document. This is useful (try it), but we haven’t told it what we actually want to see (which might be fine) and usually we have a specific thing we are looking for when we analyse a document. The second prompt is much more specific. It gives the AI clear direction (specific ask) about what we want and how we want the data presented.

Perfecting your “Prompts”

Writing good AI prompts is just the first step in working with generative AI tools like ChatGPT and Copilot. To get the best results, you need to perfect your prompts over time and practice. Don’t think of it as a chore. Enjoy it as you learn… You’ll soon become a pro.

Here are my tips on how to perfect your AI prompts:

  • Timing: I find it best to think of a task you need to perform and use a real example to see if you can get what you need. As an example, if I’m doing a customer demo on AI, I tend to use an example relevant to the organisation I am working with and make the request about them (or make up a scenario specific to them).
  • Experiment: Don’t be afraid to experiment with different prompts and see what works best. Try different variations of your prompts and see which ones generate the best results.
  • Adapt: Generative AI tools like ChatGPT and Copilot are constantly evolving and improving, so it’s essential to adapt your prompts to keep up with the latest changes. This also means the result you get from the same prompt may change a week or month later. The data it’s referencing may also change.
  • Enjoy the learning experience: Working with generative AI tools can be challenging, but it can also be a lot of fun. Enjoy the learning experience and don’t be afraid to try new things.
  • Use image creation as a fun way to learn: whilst text-based requests are usually focused on work we need to do, practising image creation using something like Microsoft Designer is great fun, and people tend to share their prompts on social media… Here is an example of one I shared.
Bing Image Creator in Designer.

Conclusion

Working with generative AI tools like ChatGPT and Copilot can be challenging, but it can also be rewarding. By following the tips outlined in this blog post, you can write and perfect good AI prompts that generate accurate and useful results. Remember to experiment, adapt, and enjoy the learning experience.

With practice, anyone can become proficient in working with generative AI tools.

The video I have included hopefully provides more context – feel free to follow along in Copilot in Edge or ChatGPT.

Microsoft 365 Copilot – what makes good AI “prompts”

Microsoft 365 Copilot was released to GA today, with a minimum purchase of three hundred licenses at $30 (USD) per user per month [around $108k per year minimum].

My last blog covered the potential ROI of using GenAI tools like Microsoft 365 Copilot, but it’s also worth remembering that Copilot also exists (for free) today inside Bing Chat and Windows 11 (if you are running the latest 22H2 or 23H2 releases).

Organisations looking to move quickly and get on board with Copilot have work to do: getting their data in shape, educating and training users, and finding and testing the use cases within their organisations to determine if and where Copilot will add the most value.

Once deployed (and this goes for any GenAI tool, to be honest), the area your adoption specialists, trainers and AI success units will want to focus on with employees is how to get Copilot to do what you ask in the most efficient way. We call this “prompting”. This blog introduces the concept and shares some tips and tricks we (Cisilion) have picked up along the way.

The way we interface with generative AI is very different to the way we use search engines (which are typically based on keyword searches). Generative AI has the ability to really understand what you are asking for and how you want the information presented. It takes a bit of time to get used to and refine, but the more you use it, the better the output and the easier and faster you get to your end result.

The Perfect AI Prompt?

Prompts are how you ask your Copilot (whether Microsoft 365, Windows, or Bing) to do something for you. This could be creating, summarising, comparing, editing, or transforming content. Prompts are “conversations”, using plain but clear language and providing the relevant information, background, ask and context of the request – just like you would if you were asking a human assistant.

Writing good prompts is the key to unlocking the power and potential of generative AI tools like Microsoft 365 Copilot.

Microsoft.

In short, a prompt has three parts:

  • Telling Copilot what you want – for example, creating, editing or summarising content.
  • Including the right prompt ingredients – for example, what you need and why.
  • Keeping the conversation going to fine tune your request and get the content you need.

Telling Copilot what you need

This may sound obvious, but we often find many people do not appreciate or understand just how particular and precise you can be with these tools. When we run workshops, I often ask the audience to use Bing Chat to create an output with the minimum number of prompts. What I typically see is people “talk” to AI like they talk to their smart speaker, typically asking a simple open question about the weather, train times, or a fact [or in my case my kids ask it for a rude joke or a silly song… or worse].

Working with Generative AI should be seen as similar to working with a person. As such, the more ambiguous the request, tone and language is, the more likely it is that the response you get from Copilot won’t be what you need or expected.

For example, a prompt such as “please analyse this spreadsheet of customer spend and provide insights into the most frequently bought products and services our customers buy, for a meeting I have with the leadership team about product and service performance” will give Copilot far more content and context about what you need than a prompt that simply asks “summarise this information for me”, which clearly misses the context and framing of what the information is required for.

Include the right prompt “ingredients”

In order to get the best response from your prompts, it is also important to focus on some of the key elements that will impact the type of response you get from Copilot. In short, this is about setting the right goal and the right context, along with which source(s) of information you want Copilot to use and your expectations of the output.

  • The Goal refers to what response you want to get from Copilot.
  • The Context refers to why you need it and who or what is involved.
  • The Source refers to which information source(s) or examples Copilot should use.
  • The Expectations refer to how you want Copilot to respond to your request.

Here’s how that fits together into a “good prompt”…
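
To make that concrete, here is a rough sketch of how the four ingredients might be combined into a single prompt. This is my own illustration rather than an official Microsoft template, and the spreadsheet mentioned is hypothetical; the point is simply that all four ingredients are present.

```python
# A rough sketch (my own example) of assembling the four prompt "ingredients"
# into one prompt you could paste into Copilot or any chat-based AI tool.
goal = "Draft a one-page product update for the Flux Capacitor V2"
context = "for a meeting with the leadership team about product performance"
source = "based on the attached Q3 sales spreadsheet"  # hypothetical source
expectations = "keep it to bullet points, in a confident but informal tone"

prompt = f"{goal} {context}, {source}. Please {expectations}."
print(prompt)
# Draft a one-page product update for the Flux Capacitor V2 for a meeting with
# the leadership team about product performance, based on the attached Q3 sales
# spreadsheet. Please keep it to bullet points, in a confident but informal tone.
```

However you phrase it, the exact wording matters less than making sure the goal, context, source and expectations are all there.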

Keeping the conversation going

Since Copilot uses the concept of turns, you can tweak, fine-tune or ask further questions based on the information generated and the information you feed it. Whilst Copilot will not learn from your data, it keeps the conversation active until you finish, meaning you can refine your requests. This helps you collaborate with Copilot like you would with a person: you can ask for more information, ask for the data to be presented in a different way, or simply change the language or tone of the response.

Examples based on the above could include asking Copilot to make the response shorter, present the data as a table, or change the tone to be more formal.

In short: when creating a prompt, think of it as if you were talking to a helpful colleague. There is no need to worry about the order, formatting, or structure; the goal is to keep it conversational.

General Do’s and Don’ts

Finally, there are some wider tips and guidance to help ensure you get the best from these conversational input methods. In short, the do’s and don’ts can be summarised as follows:

Do’s:

  • Be clear and specific with your ask. Tell it how you want the response or output generated: a draft, bullet points, in Word or in PowerPoint, for example.
  • Give examples to help Copilot do what you want. If there is a previous document or table you want it to work from, state it. If you want something in a certain style, ask.
  • Provide details that help Copilot do what you ask. Give as much background to what you are asking as possible – just like you would to a human assistant. Set the context and ask clearly.
  • Use turns (these are additional prompts) to tweak and refine your response. If you don’t like something or want something expanded or changed – simply ask.
  • Feed back to IT. Copilot is only as good as the data and information it has access to. If you are not getting the right response, it may be because you don’t have access to the right data or the data is out of date or wrong. Check the data source Copilot refers you to with IT or the document owner.

Don’ts:

  • Be vague or ambiguous. Use concise and unambiguous language, and if you want something in a certain way, tell it what you want.
  • Use slang words, jargon, or informal language. The language models Copilot uses are well trained but may misinterpret acronyms, slang and jargon and therefore give random results.
  • Give conflicting information, or ask Copilot to compare or contrast unrelated data or something that is a bad example of what you need. Keep requests clear and concise and use additional prompts to refine if necessary.
  • Change topics without starting over. The best way to end a conversation and start over is to either write “new task” or click the new conversation button.
  • Take what Copilot produces as fact without checking it first.
Examples of Good and Bad AI Prompts