Keeping it Human: AI and the future of teaching and learning at TAFE


09 January 2024

We will all be ‘prompt engineers’ in the future. Generative artificial intelligence (AI) tools are poised to transform education, industry and society. They will affect nearly every aspect of work and life for TAFE teachers, support staff and students.

It is a matter of urgency that the TAFE sector, with its focus on industry training, is properly informed, resourced and upskilled to use AI. Dealing with TAFE’s critical teacher shortage, and improving student access to the foundation studies that provide the basics, may not seem at first to have much to do with AI. But these are key elements in ensuring the new technologies are used not only safely, ethically and equitably, but also with the confidence students need to participate fully in economic and social life.

The very nature of these iterative technologies means that change will be rapid and unpredictable. If unregulated, and driven purely by commerce, AI could further entrench the digital divide, deepening the lack of access to basic information and communications technology (ICT) that is still a problem in many of Australia’s regional and remote areas.

Students in Australia are already slipping down the OECD rankings in basic literacy and numeracy, but these, along with critical thinking, creativity and analytical skills, will be the currency of a future with AI, whether that’s in manufacturing, agriculture, healthcare, construction or mining.

The urgent need for new laws and educational frameworks

With some studies suggesting more than half of all high school and university students are using ChatGPT – a tool only released in November 2022 – there is a clear need to get the legislative and pedagogical frameworks in place to deal with AI, and quickly, before the cat is completely out of the bag.

There were more than 90 submissions to the current parliamentary Inquiry into the use of Generative Artificial Intelligence in the Australian Education System. These came from private and public schools, universities, teachers and unions, including the AEU.

While many expressed excitement about AI’s potential to personalise learning and assist teaching, the general tenor of submissions is cautious. There is a focus on protecting students’ data privacy, using ethically developed AI tools – including those that respect Indigenous data sovereignty – and, most importantly of all, privileging the judgement of educators and the essential human relationship between students and teachers.

Stay calm, be curious

“We owe it to students, many of whom are already experimenting with these tools, not to hide from AI,” says Jeannie Paterson, professor of law and co-director of the Centre for AI and Digital Ethics (CAIDE) at the University of Melbourne.

“As someone involved in teaching in the technology space, and lecturing about generative AI, I see a lot of anxiety and fear. People lack confidence about trying it out for themselves. It would be useful if we could provide non-judgemental training on how to use the technology, and what its limitations are, so people can work out where they stand in these debates and not just leave it all up to the so-called ‘experts’.”

Paterson, whose research covers consumer and data protection law with a focus on the ethics and regulation of new digital technologies, recently authored a report about automated mental health and wellbeing apps. She concluded that while these apps might offer limited support to some people in some circumstances, there was potential for great harm, including data harvesting of sensitive information, and providing non-professional ‘health advice’ to vulnerable people. She identified a need for scrutiny and oversight by trained human professionals, as well as the robust enforcement of relevant consumer privacy and protection laws.

Paterson says that privacy is the central issue for AI in education, and that CAIDE’s submission to the Inquiry into the Responsible Use of AI calls for some kind of AI Safety Commissioner, along the lines of Australia’s eSafety Commissioner, a world-first initiative.

How AI affects the way we teach and learn

Generative AI uses vast data sets and algorithms to perform tasks that traditionally required human intelligence. It’s important to remember that AI models are only as good as the quantity and quality of their training data, so any biases or gaps in that data will be reproduced in their outputs.

Tools including ChatGPT, DALL-E, Midjourney and Canva AI respond to human prompts, processing large sets of data, recognising patterns, making decisions and then improving from these interactions over time in a sometimes disarming ‘human-like’ way.

ChatGPT, for instance, can answer complex questions with convincing answers. It can write essays, which raises serious questions about cheating by students. It can draft lesson plans and assist with one-on-one tutoring, including rewriting existing software or pieces of text for different learning levels.

Image-generating AI like Midjourney also uses text prompts to create images, diagrams or animations, adapting and offering alternatives as requests are refined.

Smartcopying, the official guide to copyright issues for Australian schools and TAFE, gives the following examples of how TAFE teachers may use AI to create new teaching works:

  • Multiple-choice quizzes as part of a Certificate IV in Laboratory Skills, with questions increasing in difficulty.
  • Meal plans with step-by-step recipes as part of a Diploma of Nutrition.
  • An image in the style of Andy Warhol as part of a Certificate III in Visual Arts.

Smartcopying suggests teachers might also use AI to restore and refresh older learning materials. For example:

  • Updating an accounting course.
  • Supporting a struggling student with personalised tutoring assistance outside the classroom.
  • Re-writing a piece of text in ‘easy English’ to assist a student with reading difficulties.
  • Creating accessible versions of text for students with a disability.
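For teachers curious about what the ‘easy English’ rewrite above looks like in practice, much of the work is simply in framing a clear request. The sketch below (in Python, purely illustrative – the function name and wording are the author’s assumptions, not part of Smartcopying’s guidance) builds a prompt a teacher could paste into a chat-style AI tool:

```python
# Illustrative only: builds a rewrite request for a chat-style AI tool.
# It does not contact any AI service itself; a teacher would paste the
# resulting prompt into ChatGPT or a similar tool and review the output.

def easy_english_prompt(text: str, reading_level: str = "easy English") -> str:
    """Build a request to rewrite TAFE course text at a simpler reading level."""
    return (
        f"Rewrite the following TAFE course text in {reading_level}, "
        "using short sentences and common words, without changing its meaning. "
        "Keep any technical terms that students must learn, and list them "
        "separately with a plain-language definition.\n\n"
        f"Text:\n{text}"
    )

prompt = easy_english_prompt(
    "Tensile strength is the maximum stress a material can withstand "
    "while being stretched or pulled before breaking."
)
print(prompt)
```

As Smartcopying and the teachers quoted below both stress, any AI-generated rewrite still needs to be checked by a qualified teacher before it reaches students.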

In an ideal future, AI will help teachers mark assignments, draft emails to students and reduce admin so there’s more time for that precious face-to-face contact. Students will be educated as critical thinkers and ‘co-pilots’ with AI tools, using them with a dose of healthy scepticism rather than as some kind of ‘oracle’.

The not so rosy reality

Any talk of upskilling TAFE teachers around AI seems fanciful when those teachers are already critically overworked, short-staffed and given so little professional development.

Elaine Gillespie, vice president of TAFE & Adult Education Provision at the AEU Victorian Branch, says: “TAFE is very much applied learning and it’s relationship bound. A lot of students come to us for particular reasons, because they weren’t happy in other kinds of education, or they want to be hands on. You won’t get that with AI. We need to rebuild teaching in TAFE so we can support students better.

“A lot of the materials currently generated by AI are gobbledygook and just cause more confusion for students and teachers,” says Gillespie.

“Some TAFE campuses have used AI-generated resources as a quick fix, without running them by teachers. I’ve been talking to a cross-section of different professions, from electricians, to nursing, to foundation studies, and most of them say that where their TAFE tried to create resources using AI, it’s actually created more work in the long run, in terms of fixing inaccuracies, along with plagiarism and copyright issues. They say it would have been better for teachers to just be given the time to do it properly from the beginning.”

A highly experienced TAFE teacher, Gillespie also has special expertise in the disability sector. She worries about the use of AI in courses and industries where special care is involved.

“This is where people are at their most vulnerable – nursing, aged care and disability. I would hate to think corners were cut there,” she says, cautioning about the knowledge gaps that could result.

“New technologies can be transformative, and we use them all the time – virtual reality simulations for welding, for instance, allowed students to practise without wastage or danger. But as teachers, we need to be trained and given the time to learn how to use these new tools.”

Maintaining human supervision by qualified teachers in TAFE is essential, says Gillespie, not just for safety, but for maintaining the value of the qualifications themselves.

“I worry that disreputable providers might see AI as a quick cheap way to get students through courses. But the end result could be very harmful, both for the student themselves who has paid money for their course and won’t be employable, and for the field. We’re already seeing some of these issues in construction for instance, where buildings are so poorly constructed because you’ve got a first-year apprentice being supervised by a second-year apprentice, instead of a fully qualified TAFE person.”

It’s not about cheating, but understanding bias

Professor Paterson says we need to be very cautious about using ‘anti-cheating AI tools’ to detect the use of AI in student work.

“All the studies that I’ve seen say this won’t work, but it will discriminate against certain types of students because it looks at complexity of language. I’d just like us to remember that for young people, being accused of cheating can be devastating, so let’s exercise care on all sides,” she says.

For teachers, real concerns lie in the bias built into how AI tools function: they reflect dominant cultures and are skewed heavily towards male perspectives. So how can we ensure AI is a safe space for all students while preparing them for the jobs of the future? Teaching students how to use the technology safely – how to question it, recognise bias, check its outputs and critically understand how it works – will be essential for student success and for the workforce needs to come. And soon.

Article by Rochelle Siemienowicz