By Saul Carliner, Giuliana Cucinelli, and Samira Karim
On the one hand, it seems that ever since the launch of ChatGPT in November 2022, professionals in learning and development (L&D)—including instructional design—have tried to figure out how this technology might affect the work of instructional designers. After all, some early examples of generative AI (the class of artificial intelligence tool represented by ChatGPT) include entire courses and tutoring services prepared by the software. Some writers focus on using AI; others speculate about how AI might change our jobs.
On the other hand, despite the sudden appearance of ChatGPT, AI technologies have actually played an increasing role in the work of instructional design over the past decade and longer, and are likely to play an even more important one in the years ahead.
After providing a brief background definition of AI, we explore the two ways AI has affected the work of instructional designers: as a subject about which we write and as a work tool. In doing so, we also anticipate ways that AI might further affect our work.
About AI
Although an in-depth explanation of AI falls well beyond the scope of this article, a short one might be beneficial, especially within the context of instructional design.
AI refers to the ability of machines to mimic the capabilities of humans. This includes the ability to:
- Replicate human actions (such as touching and speaking)
- Sense (such as visual and audio recognition)
- Perform human tasks with information (such as translating, correcting spelling, and writing and reviewing content)
- Make greater inferences about the information (such as characterizing audiences and predicting their learning needs)
- Solve problems and make decisions (such as assessing competence and job applicants)
AI as the subject of our work
Recently, one of the master’s students in our program took an internship at a company that produces applications that run on machine learning, in which systems adapt their behaviors and practices through the use of algorithms and inferences made from the data they receive. During the internship, the student prepared instructional materials that explained the logic underlying the program to sales representatives and product developers.
Sales representatives could use this knowledge and these skills in their calls on customers and to answer customer questions about how the software worked and what made it unique. As their job title suggests, the product developers would use the knowledge and skills to prepare updates to the product and to address problems that arose with it.
This highlights one of the ways AI affects instructional design: as the subject of the materials we prepare. At the least, aspects of AI have been increasingly integrated into the software addressed by those materials, including increasingly sensitive search capabilities, prediction capabilities (such as systems that can automatically assess credit risk, suggest adjustments to farming practices for different microclimates, and identify students at risk of failing), and voice capabilities (like chatbots that take voice input and respond in kind).
AI capabilities are also integrated into physical products. Some are new categories of products, like robots that can deliver orders to customers in restaurants and assist with resident care in long-term care facilities. Some are more traditional products, like sensors and software that monitor video feeds and can detect when something is amiss, and automobiles that can take voice commands or automatically handle some driving tasks.
For these, instructional designers need familiarity with the type of AI used; knowledge of how learners should use these “intelligent” products and how service professionals and other staff should support and service such software and products; and the ability to explain all of this to the affected learners. That responsibility poses challenges, chiefly that AI allows for mass-customizing of product use—that is, different applications with different groups of users—and that many potential uses might not be anticipated by the product developers.
Communicating about these products could also affect the way instructional designers work. One way that could happen pertains to work processes, especially for those who work on physical products that incorporate AI. As Roy and White (2024) note, with the introduction of AI, the release schedules for many of these products will shift: Rather than the traditional schedule of six months to five years, continuous release of the software components may become the norm, much like mobile apps and other software, where upgrades are released as soon as they become available. Right now, software that is built into products sometimes is not updated at all.
A move to continuous release, in turn, would change the process of preparing instructional materials into an ongoing one, turning peak-and-valley workloads into something more akin to an assembly line. Some instructional designers will need to adjust to a different work rhythm.
Communicating about these products also increases the need for instructional designers to develop awareness of the limitations and liabilities of the software and products, which designers need to communicate to learners. For example, one issue raised by the software that monitors unusual activity by video cameras mentioned earlier is the possible need to disclose that the location is monitored. That, in turn, requires instructional designers to become familiar with new regulations on use of AI—some devised voluntarily with the collaboration of industry, such as those in the US; some devised by lawmakers, such as those coming into effect in the European Union—so designers can advise learners about legal and questionable uses of products. This is similar to how medical writers need to prepare disclaimers about the uses of medication and medical devices.
AI as part of our work processes and tools
At an advanced instructional design workshop that Saul led at Lakewood Media’s Training conference in San Diego in February 2011, instructional designers presented their recently produced tutorials for critique by their peers. One showed a tutorial with clear, but “tinny” narration; it sounded machine generated. The participant commented that they used a feature in Articulate Storyline that generated the narration from a script.
Later in the same workshop, another participant also showed a narrated tutorial. The second narration sounded significantly better than the first, and participants assumed the narrator was a professional. They assumed wrong; the second narration was generated by a newer version of Storyline. The technology had progressed significantly in a few short months.
In other words, as far back as the early 2010s, AI had already begun to affect the work of instructional designers, from analysis of a project through design, development, implementation, and evaluation of the content.
Analysis: AI can summarize background readings, analyze data about audiences, and make predictions about them. In addition, voice technologies can automatically transcribe interviews with subject matter experts. Some of these technologies have been in use for years (such as data analysis and transcription tools), while others are newer (like summarizing tools).
For those who incorporate user experience practices, such as personas, scenarios, and user journeys, into their work, emerging practices with generative AI offer ways to prepare these materials that help instructional designers empathize with their intended learners. This, in turn, helps compensate for the reality that many instructional designers have limited access, if any, to the actual learners they would need to develop such materials on their own.
Design and development: The impact of AI on these activities seems to have captured the most attention. Some AI-related technologies have been in use for years, including voice technologies that not only allow for system-generated narration in tutorials and show-me videos but also let instructional designers dictate their work, should they choose.
Other AI-related technologies have assisted instructional designers for years (often without our realizing that some sort of intelligence was involved), including tools that:
- Brainstorm design ideas: that is, proposed approaches to presenting material to learners
- Provide editorial assistance, starting with checking spelling and grammar and even advising on style
- Check conformance to templates and editorial and design guidelines so designers can fix any errors
- Check the accessibility of documents, noting features out of compliance with guidelines so designers can fix those issues
- Produce both text and voice chatbots
- Automatically translate text, significantly speeding up translation and facilitating simultaneous release of technical information in several languages, something not feasible in the past
- Produce and retouch images, from correcting photographs and illustrations to generating images and animations
- Publish materials on several platforms—print-like (PDF), computer and laptop screens, and mobile screens—from a single file; some systems automatically adjust the presentation to the device
- Offer guidance on the appearance of slides and other documents
- Transform materials for different media so that they can take advantage of the affordances of those media, such as converting a PDF file into an eLearning program
These tools have affected the everyday work of instructional designers by slowly but surely incorporating editorial and illustration responsibilities into the scope of work over the past several decades.
Generative AI could bring even greater changes, such as generating part, or all, of certain types of information, as it has been doing for at least a decade in professional communication. For example, the Copilot assistants (AI tools) in Microsoft Word and PowerPoint can assist with or prepare first drafts, addressing blank screen syndrome; specialized apps can generate course syllabi (for designers in higher and continuing education) or entire online lessons in response to just a few prompts.
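To illustrate how such prompt-driven drafting works in practice (a minimal sketch, not a depiction of any specific authoring product), the Python example below asks a large language model for a draft lesson outline using the OpenAI library; the model name, prompt wording, and lesson topic are assumptions chosen for the example.

```python
from openai import OpenAI

# Assumes the openai package (version 1 or later) is installed and the
# OPENAI_API_KEY environment variable is set; the model name is illustrative.
client = OpenAI()

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[
        {
            "role": "system",
            "content": "You are an instructional designer drafting course outlines.",
        },
        {
            "role": "user",
            "content": (
                "Draft a one-hour online lesson on workplace fire safety for new "
                "warehouse employees: learning objectives, three activities, and "
                "a five-question knowledge check."
            ),
        },
    ],
)

# The output is a starting point; a human instructional designer still reviews it.
print(response.choices[0].message.content)
```

A draft produced this way addresses blank screen syndrome, but it is only a first pass; it still requires the human review discussed later in this article.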
More broadly, AI can handle routine writing tasks. Bloomberg has used system-generated headlines since 2013 and system-generated earnings reports since 2018. Sports Illustrated and the news chain Gannett have used AI to generate sports summaries for several years (Tyrangiel, 2023). The types of information that seem ripe for this approach involve material that comes from existing sources (like product specifications or the Internet) and follows a particular format or template in presentation.
Other applications, such as Sora and Synthesia, can automatically generate video, just as some applications generate text. Instructional designers prompt the system, which, in turn, produces a video clip that the instructional designer can edit.
Implementation: AI also affects delivery of instructional content. The transition from classroom to online learning that started in the 1980s and gained strength in the 1990s and 2000s also saw a transition from fixed, linear courses to ones that learners could take at their own pace, with lessons taken in any preferred order. Drawing on the tracking of learner performance facilitated by standards like the Experience API (xAPI), intelligent tools can not only record what learners can perform but also suggest which learning sequences might best address those learners' needs.
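To make that tracking concrete, here is a minimal sketch of an xAPI statement of the kind such tools rely on, sent to a Learning Record Store (LRS); the learner, activity, endpoint, and credentials are hypothetical placeholders.

```python
import requests

# An xAPI statement records what a learner did in actor-verb-object form.
statement = {
    "actor": {"objectType": "Agent", "name": "Example Learner",
              "mbox": "mailto:learner@example.com"},
    "verb": {"id": "http://adlnet.gov/expapi/verbs/completed",
             "display": {"en-US": "completed"}},
    "object": {"objectType": "Activity",
               "id": "https://example.com/courses/onboarding/lesson-3",
               "definition": {"name": {"en-US": "Lesson 3: Safety Procedures"}}},
    "result": {"completion": True, "score": {"scaled": 0.85}},
}

# Send the statement to a hypothetical Learning Record Store.
response = requests.post(
    "https://lrs.example.com/xapi/statements",
    json=statement,
    headers={"X-Experience-API-Version": "1.0.3"},
    auth=("lrs_user", "lrs_password"),  # placeholder credentials
)
response.raise_for_status()
```

Intelligent tools aggregate many such statements, across courses and applications, to infer what a learner can already do and what they might study next.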
Evaluation: AI can already assist with the tasks of creating assessment tools and rubrics. For example, tools like those provided by Contact North/Contact Nord support instructors in preparing multiple-choice and open questions, as well as in developing rubrics to guide the marking of open questions.
Similarly, learning analytics programs are intended to identify students at risk based on their behavior in individual courses and across courses, so instructors and advisors can intervene. However, as with many other applications of AI, the results need human verification: students whose behavior looks at-risk in the analytics might not actually be at risk of failing. Similarly, some institutions try to intervene with these students through pre-programmed email messages, which only reinforce the anonymity of the learning experience at the very moment those students most likely need human interaction to move past their challenges.
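As a rough illustration of how such analytics work (a simplified sketch, not how any particular product is built), the example below fits a basic classifier to invented engagement data and flags students whose predicted risk crosses a threshold; as noted above, those flags are prompts for a human advisor, not verdicts.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Invented engagement data: [logins per week, assignments submitted, average quiz score]
X = np.array([
    [5, 4, 0.82],
    [1, 1, 0.40],
    [4, 3, 0.75],
    [0, 0, 0.20],
    [3, 4, 0.68],
    [1, 2, 0.35],
])
y = np.array([0, 1, 0, 1, 0, 1])  # 1 = did not pass in earlier terms

model = LogisticRegression().fit(X, y)

# Score current students and flag those above a risk threshold;
# a human advisor still verifies each flag before intervening.
current_students = np.array([[2, 1, 0.55], [5, 5, 0.90]])
risk = model.predict_proba(current_students)[:, 1]
for features, score in zip(current_students, risk):
    print(features, "flagged" if score > 0.5 else "on track", round(float(score), 2))
```

The sketch also shows the limitation described above: the model sees only behavioral proxies, so its flags require human verification before anyone intervenes.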
Beyond instructional design: AI’s effects on work experience
AI might affect the work of instructional designers in ways beyond their day-to-day tasks. AI can assist managers with hiring instructional designers, for example. Applicant tracking systems help employers identify qualified applicants from their existing pools of applications (not just for a current opening but also past ones). "Scrubbers" that go through social media can also find potentially qualified applicants to recruit. Some organizations conduct screening interviews with automated tools that pose questions to applicants and make recordings, which AI tools analyze to choose finalists.
In addition to affecting the work of instructional designers as described here, AI will increasingly require that instructional designers build familiarity with the subject matter of the courses on which they work. That seems paradoxical, because AI can summarize source documents, speeding up advance research, and, later, generate text. But human instructional designers and their employers will remain liable for the content, and AI tends to make up facts (called hallucination), as some recent situations have demonstrated; in one case, a time-pressed attorney submitted court briefs generated by ChatGPT that referred to non-existent precedents (Cerulla, 2023). And just as that attorney was sanctioned by the court, instructional designers could find themselves liable for inaccurate information in their materials.
What this means to instructional designers
In the long term, instructional designers will likely work with AI in one of the ways suggested by Perkins, Furze, Roe, and MacVaugh's 2024 AI Assessment Scale, applied here to the impact of AI on job tasks:
- Level 1: No AI. That is, some tasks will not be affected by AI. We anticipate the number of such tasks to be limited.
- Level 2: AI Planning, in which AI supports instructional designers in conducting research and in generating and developing ideas, capabilities that could ultimately affect the way instructional designers conduct analysis and design.
- Level 3: AI Collaboration, in which AI assists with specific job tasks, such as drafting materials and providing feedback on the work. Much of this type of AI is already in use, such as tools that assist with turning PowerPoint slides into draft scripts, offering grammar and style advice, verifying compliance with accessibility requirements, and converting text into narration.
- Level 4: Full AI, in which AI performs all of the tasks of the job, such as applications that can generate entire asynchronous e-learning courses with limited input. Although the technology exists today to perform these tasks, it cannot perform them without errors, suggesting that instructional designers still have a supervisory role of sorts in this work.
- Level 5: AI Exploration, in which instructional designers work with various AI applications to generate new solutions to the challenges of our job.
This scale is widely used in education, by organizations ranging from UNESCO to individual schools.
As this discussion suggests, the long-term impact of AI on the work of instructional designers is likely to be incremental, happening in fits and starts: AI affecting a product or process about which we write, or one or two job responsibilities adjusted at a time as one or two new AI-aided capabilities appear in the tools we use to perform our jobs.
Many instructional designers are already using AI at some of the levels of this framework and will have more opportunities to use it at other levels in the next few years. The question is not whether AI will affect instructional designers' jobs; it already does. The challenge is how instructional designers will respond and to what extent we will play active roles in shaping our relationship with AI.
References
- Cerulla, Megan. 2023. “A Lawyer Used ChatGPT to Prepare a Court Filing. It Went Horribly Awry.” CBSNews.com, May 29, 2023. Accessed January 14, 2024. https://www.cbsnews.com/news/lawyer-chatgpt-court-filing-avianca/.
- Perkins, Mike, Leon Furze, Jasper Roe, and Jason MacVaugh. 2024. "The Artificial Intelligence Assessment Scale (AIAS): A Framework for Ethical Integration of Generative AI in Educational Assessment." Journal of University Teaching & Learning Practice 21 (6). Accessed June 2, 2025. https://open-publishing.org/journals/index.php/jutlp/article/view/810.
- Roy, Abhirup, and Joseph White. 2024. "At CES, Legacy Automakers Scramble to Keep Up in AI Arms Race." The Globe and Mail, January 12, 2024. Accessed January 13, 2024. https://www.theglobeandmail.com/drive/article-at-ces-legacy-automakers-scramble-to-keep-up-in-ai-arms-race/.
- Tyrangiel, Josh. 2023. "What Sports Illustrated's BotGate Really Means for Journalism." Washington Post, December 1, 2023. Accessed December 10, 2023. https://www.washingtonpost.com/opinions/2023/12/01/ai-sports-illustrated-bot-journalism/.
Note: References reformatted in Chicago style with the assistance of Microsoft Word Co-Pilot.
Image credit: Chayada Jeeratheepatanont