
The Role of AI in Course Development and How It Potentially Impacts Learners

  • 30 May, 2023
  • Reading Time: 9 mins

With the rise of Large Language Models (LLMs) like ChatGPT, PaLM 2, Anthropic’s Claude, and many others, there is no doubt that generative AI solutions built on these LLMs will be used to assist with the creation of an array of products, including the educational products, platforms, and courses that learners will learn from.

It is obvious to anyone who has played with solutions like OpenAI’s ChatGPT that vast amounts of content can be produced quickly and with a high degree of accuracy. These AI tools can amaze you, and they can also stoke the fear of being replaced by AI. This article doesn’t dwell on the possibility that very talented people may be replaced by AI; rather, it focuses on the benefits of using AI to help develop course content, and on what to look out for.

First, let’s look at some of the benefits you will find in using AI solutions like ChatGPT for content development, beginning with improved efficiency.

As you may have experienced with generative AI chatbots, the speed with which you can get a full and accurate response is nothing short of mind-blowing, and the responses to your prompts will only get better, faster, and more precise as these solutions improve and their language models continue to learn. Today, AI can be used to develop an entire lesson plan from a few simple prompts, outlined against specific outcomes or against examinations tied to particular standards or compliance requirements. This level of efficiency is very difficult for a single person, or even a team of people, to achieve in a few hours, and the productivity gained is so measurable in time savings that anyone not using AI risks being left behind.
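As a rough sketch of what this might look like in practice (not taken from the article), the snippet below uses the OpenAI Python SDK to draft a lesson plan mapped to a couple of hypothetical learning outcomes; the outcomes, prompt wording, and model name are only illustrative.

```python
from openai import OpenAI  # pip install openai

client = OpenAI()  # assumes OPENAI_API_KEY is set in your environment

# Hypothetical learning outcomes, used purely for illustration
outcomes = [
    "Explain the difference between supervised and unsupervised learning",
    "Evaluate a classification model using precision and recall",
]

prompt = (
    "Draft a one-week lesson plan for an introductory machine learning course.\n"
    "Map each day's activities to these learning outcomes:\n"
    + "\n".join(f"- {o}" for o in outcomes)
)

response = client.chat.completions.create(
    model="gpt-4",  # any chat-capable model works here
    messages=[
        {"role": "system", "content": "You are an experienced instructional designer."},
        {"role": "user", "content": prompt},
    ],
)

print(response.choices[0].message.content)
```

A human reviewer would still need to check the generated plan against the actual standard or syllabus before it goes anywhere near learners.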

Supplementary learning aids that help reinforce learning, such as videos, animations, assessments, and games, can also be developed with generative AI quickly and accurately. For video and simulations, the AI solutions available now are not all that strong; however, at the pace these models are improving, there may soon be dozens of solutions to help produce high-fidelity assets.

In addition to the efficiencies gained in developing course materials, AI agents can be used to help review and QA the content and course materials, making sure they fulfil specific outcomes and follow established course standards, thus improving the efficacy of the content. These agents can be deployed with little knowledge of how AI agents work and can be fine-tuned to work with the course materials you are trying to create. For example, Google’s PaLM family includes models tuned for specific learning topics and professions, such as Med-PaLM for medicine, which increases the accuracy of the content the generative AI produces.
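In the same spirit, and again only as an illustrative sketch with a hypothetical file name and outcomes, the generation pattern above can be turned around into a simple review step that checks a draft against each stated outcome:

```python
from openai import OpenAI

client = OpenAI()
draft = open("lesson_draft.md").read()  # hypothetical draft course material

outcomes = [
    "Explain the difference between supervised and unsupervised learning",
    "Evaluate a classification model using precision and recall",
]

# Ask the model to act as a reviewer and flag any outcome the draft misses
for outcome in outcomes:
    review = client.chat.completions.create(
        model="gpt-4",
        messages=[
            {"role": "system",
             "content": "You review course materials against stated learning outcomes."},
            {"role": "user",
             "content": f"Outcome: {outcome}\n\nDraft:\n{draft}\n\n"
                        "Does the draft adequately cover this outcome? "
                        "Answer YES or NO with one sentence of justification."},
        ],
    )
    print(outcome, "->", review.choices[0].message.content)
```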

Now, one could argue that while the AI can output content, it might not fully understand the nuances of learning design models; however, this is not necessarily the case. These generative AI systems and their underlying LLMs have a working understanding of most learning design standards and can be trained rather quickly on new ones. The course materials you produce will be sound and can be mapped to multiple learning design models and taxonomies.

If you then launch these courses on an LMS that lets an AI see how students interact with the content, or that provides access to student performance data from assessments and assignments, you can train the AI to help shape future versions of your course and to increase student performance and learning outcomes. Improving your course, the student experience, and outcomes is now more achievable than ever before.

Now you may be asking: how do I know the content and course materials are accurate? Why would I put my trust in something that is entirely AI-generated, and why should a student? The reality is, you should not, at least for now. You will still need writers, reviewers, and subject matter experts on your team. It may also help to have an AI expert, not necessarily someone who can engineer AI, but someone aware of the challenges of producing content with AI. Writers will be necessary to review grammar and to make sure the AI has output content aligned to the subject and topic with appropriate context. A current flaw of these AI solutions is that they sometimes “hallucinate”, providing content that has no reference and is made up out of thin air.

As you can see, the benefits of improved efficiency are significant when you properly leverage AI solutions for developing courses and content in general. With that come other major benefits, which typically include:

  • Faster time to market
  • The likelihood of faster and higher adoption rates
  • The ability to change materials based on student performance and feedback
  • Possible reduction in overhead to produce content
  • The ability to reference the vast information resources that LLMs are known for
  • And perhaps improved profitability

Obviously, these are all appetizing reasons to use AI, particularly for those who crunch numbers and look at balance sheets all day, and this is the reality we face today. These LLMs and AI solutions can make many job roles and functions so efficient that corporate CFOs, and the companies that finance them, will expect AI to be used in the development of many products. This means we all have to adapt, and perhaps it will free up time to create many more products and learning experiences that were previously hard to resource for lack of time or financing.

As mentioned earlier in this article, producing content of any kind with AI can have a few shortcomings, particularly with grammar, syntax, style, and accuracy. We have already talked about “hallucinations” and how generative AI solutions can create false or inaccurate information. In addition, generative AI often produces content that reads as robotic or non-human. This is where a good writer can apply their skills, taking that first pass of AI-generated content and rewriting it to be more human and easier for the learner to read.

Another issue that often comes up is machine bias, where the AI generates content with a bias that could affect the learner. These biases are not developed in a vacuum by the AI; they mostly stem from the datasets the AI was trained on and the algorithms it was built with.

With regard to LLMs, biases may come from a combination of the training data, the algorithm, and the humans training these systems. LLMs are typically refined using Reinforcement Learning from Human Feedback (RLHF), which combines supervised fine-tuning with reinforcement learning on human preference data, helping the model learn to produce responses people rate highly and potentially catching or correcting errors in its deduction and reasoning. This method of training is not perfect and probably never will be, so having humans available to review AI-generated course content will remain necessary to limit biases that could negatively affect the learner.
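To make the “human feedback” part concrete, here is a toy sketch, not from the article, of the preference-ranking loss commonly used to train the reward model inside an RLHF pipeline; the numbers and variable names are invented purely for illustration.

```python
import torch
import torch.nn.functional as F

def preference_loss(reward_chosen: torch.Tensor, reward_rejected: torch.Tensor) -> torch.Tensor:
    # Bradley-Terry style objective: push the reward model to score the
    # human-preferred answer higher than the rejected one.
    return -F.logsigmoid(reward_chosen - reward_rejected).mean()

# Toy scores a reward model might assign to two pairs of candidate answers
chosen = torch.tensor([1.2, 0.4])    # answers human labellers preferred
rejected = torch.tensor([0.3, 0.9])  # answers they rejected
print(preference_loss(chosen, rejected))  # lower loss means the rankings match human preference
```

The point of the sketch is simply that human judgements are baked into the model’s behaviour, which is exactly why any bias in those judgements can surface in generated course content.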

A distinct problem that many CEOs and IP lawyers are concerned about is IP ownership. If the generative AI and its LLM created the content, who owns the IP? The truth is, this is a very hard issue to resolve, and it may take years to be fully addressed, if it ever is. LLMs learn from information, and that information is everywhere, including every interaction you have with the model. With the plethora of knowledge available on the internet and in open libraries, LLMs have access to vast amounts of information that shapes how the AI responds, in what language and structure.

The IP issue has sparked legal threats and pushback against companies like OpenAI, from Reddit and even from Elon Musk (who now owns Twitter), since these LLMs also read through millions of posts a day to learn from. This has raised not only an IP concern but also a data privacy concern, which can ultimately affect the learner as well.

For those two reasons alone, IP and data privacy, many companies may opt out of using generative AI solutions to create content. Your skills may still be needed to create course content the proven, old-fashioned way.

That said, there are now ways to get the power of an LLM while keeping your data and IP protected. OpenAI recently announced a feature for ChatGPT that keeps your chat conversations and outputs private. In addition, there are solutions like PrivateGPT that run locally on your own device and can use fine-tuned models from organizations like Hugging Face, reducing the chance that your IP or data becomes public. These small language model (SLM) solutions and GPTs may not be as robust as their larger LLM counterparts, but they look capable of getting the job done, letting you learn and create content with generative AI on your own networks and devices. Granted, there are accuracy and cost challenges with this route, which is why you will still need a content SME on hand.
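As a rough sketch of that local route, assuming the Hugging Face transformers library, you can load an openly licensed model and generate drafts without sending anything off-device; the model name below is only a placeholder.

```python
from transformers import pipeline  # pip install transformers torch

# Everything below runs on your own machine; no prompts or outputs leave the device.
# "gpt2" is only a placeholder; swap in an instruction-tuned open checkpoint
# from the Hugging Face Hub for usable course-content drafts.
generator = pipeline("text-generation", model="gpt2")

prompt = "Write three quiz questions about photosynthesis for a high-school biology class."
result = generator(prompt, max_new_tokens=150, do_sample=True)
print(result[0]["generated_text"])
```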

The future of generative AI, and of AI in general, is here and is here to stay, barring a cataclysmic event that stops it. The solutions available today only scratch the surface of what AI has the potential to solve, and of what, and perhaps who, it may replace. For the time being, we can fight the change or learn to adapt to it, exploiting what we can while we can. What no one knows for sure is what the future will look like; what we have is a lens on the next few years and on what knowledge and learning mean today.

A client and friend of mine said recently, “Knowledge is passé. In the future, we might not have a need for knowledge since knowledge is at your fingertips with these LLMs and AI. What you will need is an understanding of concepts and how that is tied to knowledge that is readily available at your fingertips.”

Ultimately, how, what, and when we learn will change, and it will change more rapidly now. As it does, I hope to provide you with more advice on the best ways to adapt, as both creator and learner.


WRITTEN BY

Ray Alba