Recent developments in artificial intelligence (AI) and other emerging technologies, such as data analysis and virtual reality, create challenges for the higher education sector. One challenge is preparing students for the demands of workplaces that are more technology-reliant than ever; to do this, we must integrate these technologies into the curriculum. Here are three ways to do it.
1. Use emerging technologies in the classroom
During the pandemic, the use of new technologies in the learning process became normalised – whether it was Zoom, Teams or online learning platforms like Aula, Brightspace and Google Classroom. While many of us have moved back into the in-person classroom, it’s still worth keeping abreast of technological developments and integrating them into your teaching.
As an example, when ChatGPT emerged in 2022, much of the media coverage concerned its negative attributes – specifically, how it could be used to cheat. But it’s unrealistic to expect students to be aware of ChatGPT and other generative AI tools and not make use of them.
Our role as academics is to guide our students to use these tools appropriately, in a way that does not constitute plagiarism or stifle creativity. When used correctly, these tools can enhance critical thinking and the learning experience. They can be used to automate tasks, ultimately helping to increase productivity and promote adaptive learning.
For example, when I ask students to work on a case study using ChatGPT, I break this down into two subtasks – one to generate the output and the next to validate it and make modifications if misinformation is identified. We have to be realistic, and accept that ChatGPT and similar AI tools will be, and are being, used in this way in the real world. It’s important that students approach this critically, employ literature research skills, evaluate output and make adjustments.
2. Demonstrate its real-world significance
AI is not limited to computer science. It intersects with various fields such as healthcare, business, finance and civil engineering. While integrating AI and other emerging technologies into teaching is important, it’s also vital that we apply these concepts and technologies to real-world scenarios to show their relevance and significance.
Teaching students who are not destined for a career in IT about technology can be challenging. Which aspects of this do they really need to know? What vocabulary should we use? What skills do we need as academics to teach this in an effective way?
Based on my own experiences in teaching business and management students in Canadian and Scottish universities, there is no substitute for active learning. I bring real-life examples of how technology is used into the classroom and build activities around them.
For example, when explaining to business students the concept of information systems technology – essentially, the tools and systems a company uses to manage information efficiently – I aim to show, rather than simply tell. How these systems are built and used by a company isn’t obvious, and can even be a little abstract. To aid understanding, I make it practical, using the university information system – something they’re very familiar with – as a concrete example. Students are tasked with thinking critically and being strategic, and they are also encouraged to be curious and to ask questions about how these technologies work.
Bringing practical, real-life examples of how technology is used in industry into the classroom, with activities built around them, is invaluable. Collaborations with industry partners are also helpful in demonstrating these technologies. Partnerships with professionals working in or with AI, robotics and virtual reality provide opportunities for guest lectures, mentorship and even access to real-world projects, exposing students to industry practices and trends.
3. Integrate future technologies into your teaching
Technology is evolving at a dizzying pace, creating a scenario in which what you teach in first year may be outdated or even irrelevant by the time a student reaches the final year of their degree. It’s important to keep one eye on the future, something that is not as impossible as it might sound.
Major companies such as Apple, Google and Meta are investing vast amounts of money in the metaverse, an immersive online world that exists alongside our own. At present, it exists largely as a concept, although it is widely accepted among technology professionals that it will be a key part of our future – and one that offers big economic opportunities. With that in mind, as a digital business lecturer, I make sure my students are familiar with this particular technology and feel prepared for when it comes into common usage. At UWS, we teach and use the metaverse in a few modules and make sure students comprehend its benefits as well as its challenges.
In many situations, we know what the next big technology or development will be, or is likely to be – and it’s helpful to discuss this in your teaching, alongside any ethical or moral implications, where relevant.
To integrate emerging technologies into the curriculum, educators must adopt a proactive and forward-thinking approach. Embedding these technologies in your teaching can enhance learning experiences and prepare students for a workforce that will increasingly revolve around new technologies. By acknowledging and, in some cases, embracing these advancements, we can help to equip students with the skills and knowledge needed to thrive and adapt in a workplace that is evolving constantly, significantly – and quickly.
Sabrina Azzi is a lecturer in digital business at the University of the West of Scotland.