Since the release of OpenAI’s chatbot, ChatGPT, in late 2022, professionals have been attempting to predict the future effects of generative artificial intelligence (GenAI) on the workforce. GenAI has already proven capable of writing essays, generating code, and translating text automatically, dramatically transforming even the most basic office tasks.
Within the context of sales enablement, GenAI has already become pervasive, writing sales emails, developing sales collateral, improving customer interactions, and generating insights from sales data.
Alongside these developments, immense changes are being made to the field of presentation software, as many companies are developing AI-driven technology that can automatically generate business presentations based on a company’s data.
Currently, many companies choose to save their slides within a central library repository, allowing sales to quickly access approved slides and avoid creating presentations from scratch.
As GenAI tools emerge that make it easier to generate slides quickly, it becomes important to understand their impact on slide libraries and on the need to reuse PowerPoint assets.
This article provides an introduction to Microsoft Copilot and its predicted impact on how slide presentations are created, examples of other types of AI presentation software, and an initial assessment of the benefits and risks inherent in such technology.
Microsoft 365 Copilot
What is Copilot?
At the forefront of many discussions of AI’s integration into the workplace is Microsoft 365 Copilot – a GenAI-based tool which, upon its release, will be embedded into many Microsoft applications, such as Word, PowerPoint, and Excel.
Copilot will include a chat interface, through which users can enter commands prompting it to generate content based upon the information stored within a company’s files, emails, and calendars (9). Copilot will also be able to suggest edits for existing content to improve readability and accuracy.

Copilot and PowerPoint
Copilot will likely make it possible for employees to create slide presentations much more quickly and easily than in the past. Within PowerPoint, Copilot will be able to generate presentations from proposals or documents, write speaker notes, and tweak specific slides if needed (6).
It can also automatically input images saved through Microsoft’s OneDrive that pertain to the presentation’s subject, and insert slide transitions or animations (1). Most importantly, Copilot will be able to use the data stored within a company’s Microsoft account in order to raise and support claims within its presentations (9).
Information stored within an Excel spreadsheet, for example, may be drawn upon by Copilot to provide the evidence behind a sales pitch.
How does it work?
In essence, Microsoft Copilot is a more sophisticated version of widely available AI tools such as ChatGPT, as it can gather information from a company’s data platforms in real time rather than depending only on the data set on which it was trained.
To better understand how it works, the Copilot stack can be divided into four layers: infrastructure, foundation models, orchestration, and user experience. Copilot’s infrastructure, like ChatGPT’s, is built on Microsoft Azure’s supercomputing capabilities and provides the framework on which the foundation models run (8).
The foundation models themselves are capable of performing many different types of complex tasks, such as generating content – as a point of comparison, foundation models form the core of ChatGPT (4). These models have access to large data sets, which they use to provide relevant responses to the prompts they receive.
If the foundation models do not have access to enough data, they may hallucinate and make incorrect claims. The next two layers – orchestration and user experience – respectively analyze prompts to determine how best to respond to them, and make up the viewable interface (8).
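To make the orchestration layer concrete, here is a minimal, hypothetical sketch of the pattern described above: a prompt is first grounded with retrieved company data, and only then passed to a foundation model. All names and the retrieval logic are illustrative assumptions, not Microsoft’s actual APIs.

```python
def retrieve_company_data(prompt, documents):
    """Naive retrieval: return documents that share a keyword with the prompt."""
    keywords = set(prompt.lower().split())
    return [doc for doc in documents if keywords & set(doc.lower().split())]

def orchestrate(prompt, documents, model):
    """Ground the prompt in retrieved context, then call the foundation model."""
    context = retrieve_company_data(prompt, documents)
    grounded_prompt = "Context:\n" + "\n".join(context) + "\n\nTask: " + prompt
    return model(grounded_prompt)

# A stand-in "foundation model" so the flow can be demonstrated
# without a real model endpoint.
def dummy_model(grounded_prompt):
    return f"[model response based on {len(grounded_prompt)} chars of input]"

docs = ["Q3 sales grew 12 percent in EMEA", "Office lease renewal notes"]
print(orchestrate("Summarize sales performance", docs, dummy_model))
```

The key design point is that grounding happens before generation: the model sees the relevant company data in its prompt, which is what lets Copilot cite live business information rather than only its training data.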
Other AI Presentation Software
In recent years, a variety of different GenAI-driven tools for creating and enhancing presentations have been launched, many of which are currently available for use online. Most of these products include some of the features boasted by Copilot, although with different levels of integration into Microsoft 365 applications.
None of them, currently, have access to the complete set of data stored in a user’s Microsoft account. For this reason, it is easier not to consider these applications as alternatives to Copilot, but rather as more specialized tools, each with a more specific purpose and use.
Examples
One example of popular AI-driven presentation software is Decktopus, which can generate presentations based on information entered about the desired topic, audience, and purpose of each slide deck. Decktopus can also insert speaker notes into the presentation and suggest stock images and media for insertion.
Another example is Sendsteps AI, a PowerPoint add-in that can create presentations based on an imported document as well as in response to a prompt. It is capable of writing presentations in multiple languages and of creating interactive elements, such as quizzes, to keep the audience engaged.
In addition to the numerous PowerPoint-based programs, similar software exists as add-ins for Google Slides. Some examples are SlidesAI, which converts text into a presentation, and MagicSlides, which can select images to match the created text. Furthermore, several applications exist with more specific purposes, such as DeckRobot, which uses AI to automatically format presentations so they look visually appealing, but cannot write them.
Why is this important?
With these AI tools, the initial stages of writing and designing PowerPoints will become considerably easier for human employees, who will no longer have to devote as much attention to the more frustrating aspects of creating presentations, such as formatting, design, and wording. However, these AI applications (except for Copilot) will not have access to a company’s information, so humans may need to insert supporting data themselves and will need to fact-check any information generated by the AI. Furthermore, even for AI-driven tools with access to a company’s data set, there are elements of risk involved in their usage that balance their positive capabilities.
Risks of Using AI to Generate Presentations
Prediction
It is likely that AI will make it considerably quicker and easier for humans to generate business presentations. However, there is a risk that these slides may be lacking in veracity or quality in general, at least in the immediate future.
Numerous factors indicate this, such as the current performance of GenAI tools like ChatGPT – as seen in their creative output and proclivity to hallucinate – and the tendency of humans to implicitly trust decisions and information generated by AI.
What are AI hallucinations?
When Copilot, or similar AI software, receives a prompt and categorizes it into a query, the large language models access the data they have on that topic and attempt to respond accordingly. Because of this, if this data set is inaccurate, the AI analysis will be similarly incorrect and potentially misleading (2).
This means that if information within a company’s files stored in Microsoft is outdated or otherwise incorrect, Copilot may use that data when generating a presentation, presenting its analysis as completely factual.
Even more concerningly, AI such as ChatGPT also often generates new text that is completely incorrect or makes little sense (10). This is referred to as a hallucination, and usually happens when the AI is asked for information not clearly available within its data set. When pushed, the AI will confidently state that its assertions are true, and on occasion will make up fake sources which it claims are evidence (10).
Oftentimes, these hallucinations are based on the biases it detects on the internet, such as hate speech or political opinions (9). ChatGPT also occasionally will leap to unexpected conclusions that have little to do with the given prompt and no factual basis (3).
Human response to AI decisions
A series of experiments from Harvard University and the University of California, Berkeley found that, contrary to popular belief, humans in general “adhere more to advice when they think it comes from an algorithm than from a person” (5), as they consider machines more reliable than human judgment.
In the study, participants were asked to estimate a value based on a provided data set (for example, guessing the weight of a person in a photograph), and were then evaluated on how much they amended their guesses in response to human or algorithmic advice. The researchers found that participants changed their estimates the most to reflect the advice of an algorithm, over that of a human or even their own initial opinions.
These findings suggest that humans are likely to trust AI decisions, and thus might not check information and analysis generated by an AI as diligently as they would review a human coworker’s findings or even their own evaluations. This could be problematic in situations wherein an AI hallucinates or has access to a faulty data set.
Application to AI
In summary, if an AI tool such as Copilot that is linked to a company’s files has access to outdated or incomplete data, it may hallucinate and generate faulty or incorrect information within a slide presentation while stating that this information is true.
Human employees utilizing the presentation may hesitate to thoroughly fact check the AI generated content, given the human propensity towards trusting the judgment of AI, and thus misinformation could spread easily within such presentations.
Furthermore, just as humans may not check AI-generated content for correctness, they may also struggle to recognize inconsistencies in the language and branding used within presentations, which are critical to a company’s image and potentially more difficult for AI to reproduce. Thus, companies may wish to have human employees ensure that such consistency exists within AI-created presentations as well.
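Parts of such a consistency review can be automated. Below is a minimal, hypothetical sketch that flags off-brand spellings in AI-generated slide text against an approved term list; the term list and slide text are illustrative examples, not a real company’s style guide.

```python
import re

# Illustrative map of common off-brand variants to their approved spellings.
KNOWN_VARIANTS = {
    "Team Slide": "TeamSlide",
    "Power Point": "PowerPoint",
}

def flag_off_brand(slide_text):
    """Return a list of (found, approved) pairs for off-brand spellings."""
    flags = []
    for variant, approved in KNOWN_VARIANTS.items():
        if re.search(re.escape(variant), slide_text):
            flags.append((variant, approved))
    return flags

print(flag_off_brand("Our Team Slide library integrates with Power Point."))
```

A check like this cannot judge tone or visual branding, so it complements rather than replaces human review of AI-created presentations.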
Conclusion
GenAI will undeniably change how presentations are generated and maintained, with far-reaching consequences that may vastly redefine traditional office norms. It is likely that AI presentation software will make it much easier and more efficient for workers to generate presentations; however, there are potential risks involved in utilizing this technology.
Most notably, AI has a tendency to hallucinate and generate untrue or nonsensical information based on small flaws or biases detected within its data set. Furthermore, based on scientific evidence demonstrating algorithm appreciation, humans are likely to trust the information produced by AI, in many cases over their own better judgment.
Within the world of sales enablement and PowerPoint presentations, one way to help mitigate these risks is to ensure that users have access to human-verified content. When creating presentations, rather than always building slides from scratch using AI, employees could instead use a slide library to more easily save and reuse accurate slides within an organization.
TeamSlide
TeamSlide is a slide library and slide search tool that helps marketing leaders ensure sales teams can find their best slides. It connects to your content repository, keeping it as your single source of truth and giving you easy access to your best content. Learn more →
Sources
- Abril, Danielle. “You May Soon Be Able to Use AI in Microsoft Word, Outlook.” The Washington Post, 17 Mar. 2023, www.washingtonpost.com/technology/2023/03/16/microsoft-office-ai-copilot/.
- Baxter, Kathleen, and Yoav Schlesinger. “Managing the Risks of Generative AI.” Harvard Business Review, Harvard University, 6 June 2023, hbr.org/2023/06/managing-the-risks-of-generative-ai.
- Fowler, Geoffrey A. “Perspective | ChatGPT Can Ace Logic Tests Now. But Don’t Ask It to Be Creative.” The Washington Post, 14 Apr. 2023, www.washingtonpost.com/technology/2023/03/18/gpt4-review/.
- Friedland, Alex. “What Are Generative AI, Large Language Models, and Foundation Models?” Center for Security and Emerging Technology, Georgetown University, 21 July 2023, cset.georgetown.edu/article/what-are-generative-ai-large-language-models-and-foundation-models.
- Logg, Jennifer, et al. “Algorithm Appreciation: People Prefer Algorithmic to Human Judgment.” Organizational Behavior and Human Decision Processes, vol. 151, 2019, pp. 90–103, doi:10.1016/j.obhdp.2018.12.005.
- Marks, Gene. “On AI: Microsoft Copilot Is Going to Be Huge. Here Are 6 Critical Things Every Business Owner Should Know.” Forbes, Forbes Magazine, 14 July 2023, www.forbes.com/sites/quickerbettertech/2023/07/12/on-ai-microsoft-copilot-is-going-to-be-huge-here-are-6-critical-things-every-business-owner-should-know/?sh=2bca8b192ab9.
- Metz, Cade. “What Makes A.I. Chatbots Go Wrong?” The New York Times, 29 Mar. 2023, www.nytimes.com/2023/03/29/technology/ai-chatbots-hallucinations.html.
- MSV, Janakiram. “Inside Microsoft Copilot: A Look at the Technology Stack.” Forbes, Forbes Magazine, 30 May 2023, www.forbes.com/sites/janakirammsv/2023/05/26/inside-microsoft-copilot-a-look-at-the-technology-stack/?sh=b6be8205b590.
- Spataro, Jared. “Introducing Microsoft 365 Copilot – Your Copilot for Work.” The Official Microsoft Blog, Microsoft, 16 May 2023, blogs.microsoft.com/blog/2023/03/16/introducing-microsoft-365-copilot-your-copilot-for-work/. Accessed 25 July 2023.
- Weise, Karen, and Cade Metz. “When A.I. Chatbots Hallucinate.” The New York Times, 1 May 2023, www.nytimes.com/2023/05/01/business/ai-chatbots-hallucination.html.