Roadmap needed to navigate the edtech landscape


28 January 2025

The burgeoning national edtech market must be built around high-quality resources to establish a resilient baseline amid the rapid influx of digital resources and learning applications, say Professor Leslie Loble AM and Dr Kelly Stephens, from the University of Technology Sydney (UTS) Centre for Social Justice and Inclusion.

Loble and Stephens are the authors of a new research paper, Towards high quality in Australian educational technology, which raises concerns about the dark side of artificial intelligence (AI), including data sovereignty and safety, equity and inclusion, inherent bias, and commercial interests.

The paper also addresses concerns about public school resourcing and about teachers' workloads, roles and relationships with students. Generative artificial intelligence (GenAI), which is capable of mimicking human content, ideas and data, adds a further layer of complexity.

Despite the rapid growth of the market and the proliferating number of publicly available edtech apps, which number around 500,000 on Apple and Google, with more still marketed directly to schools, there is “no independent, comprehensive source of information about the quality of digitally enabled education resources in Australia”, the paper says.

“Schools, teachers, students and their parents can find themselves having to navigate a confusing market without the time, information, or technical expertise they need to answer critical questions like:

  • Are these tools aligned to the Australian curriculum (or local variants) and to evidence-backed approaches to teaching and learning?
  • Are they designed to benefit the full range of learners?
  • Who owns the data and what does that mean for data sovereignty and safety?
  • Is there evidence that they work, and for whom?

“In worst-case scenarios, edtech is not only ineffectual, but dangerous,” the paper says.

Stephens says robust quality assurance (QA) can relieve teachers and schools of a burden they should not have to carry: making detailed and sometimes technical judgements about a resource's fitness for purpose.

The need for GenAI literacy and training for leaders, teachers, support staff, students, parents, guardians and policymakers was among the 25 recommendations of a federal parliamentary committee report, Study buddy or influencer, released in September.

GenAI “presents exciting opportunities and yet high-stakes risks for the Australian education system”, the House of Representatives Standing Committee on Employment, Education and Training acknowledged following its inquiry into the use of GenAI.

The recommendations included providing funding to set up virtual and physical hubs to provide expert and technical advice and support to institutions, regulating edtech companies and developers through a system-wide risks-based legal framework, and expediting the implementation of the Australian Framework for GenAI in Schools (released in January).

Loble was an expert advisory panel member for the inquiry and is Chair of the Australian Network for Quality Digital Education (ANQDE), a cross-industry leadership group.

“The good news is that the recommendations are substantively aligned with our QA report, and the committee has specifically called out the need to address the digital and educational divide, as well as safety and security,” she says.

“They recognise the existing risks of these tools, which we need to mitigate, but also the risk of doing nothing – we need to be alert to both to avoid worsening Australia’s learning divide.”

Quality assurance can support systems by providing a national process and avoiding unnecessary duplication of effort by states and territories. But states would still be able to “run their own ruler over a resource” if they wanted to assure themselves of alignment with any particular state-based criteria.

“National quality standards mean this would be a less resource-intensive process if all the fundamentals have already been assessed,” she says.

NSW Teachers Federation deputy president Amber Flohm agrees it would be “untenable to simply assume that school leaders, teachers and support staff possess the technical expertise, time, and resources to manage these risks on their own”.

“Sufficient and effective regulation and scrutiny by education systems and government is the only way to ensure educational integrity, privacy and ethical concerns are balanced against commercial interests as the use of edtech and generative artificial intelligence becomes more widespread,” Flohm says.

From trial to tool

From Term 4, public school teachers in NSW will have access to the department of education's endorsed NSWEduChat GenAI tool, initially trialled with students in response to statewide bans on ChatGPT last year.

The department says the trials, conducted in 50 schools, showed the tool could save time by producing student resources and automating administrative tasks, “giving teachers more time to focus on personalised learning and student interactions”.

“NSWEduChat does not replace the valuable work of our teachers, it helps them to save time, tailor their resources, and focus on their critical work in the classroom,” says education minister Prue Car.

Flohm says NSWEduChat was initially designed to assist with student tasks such as essay writing, and to collect data on equity and data privacy, but she cautions against the de-professionalisation of teachers.

“When it comes to professional tools for teacher use, available technology should not determine what the solution is and then work back to the problem. Rather teachers should work out what they want AI to do to support their work,” she says.

“The capacity of GenAI to create immediate lesson plans is obvious, and no doubt attractive to a time-poor profession. However, understanding how syllabus, curriculum and the associated pedagogies interact to benefit the growth of students’ knowledge and skills is the core of teachers’ intellectual labour, and this must never be reduced or outsourced to technology.”

Testing the tools

Though work is being done at all levels, national standards are needed, and teachers must be brought in to help with evaluation. They will need to ensure GenAI tools align with their schools' needs, including student literacy and learning levels and backgrounds, and that teacher knowledge and skill is used to turn data into effective classroom practice.

Stephens says there is “currently nothing in the way of national standards, apart from ESA’s Safe Technology for Schools program, recently updated for GenAI”.

She says evaluation is benefiting from reviews across diverse fields, including by teachers, edtech and learning media experts, child development scholars, instructional designers, K-12 subject matter experts, and school technology leaders.

“Our consultations with teachers have suggested that rather than diminishing the importance of teacher professionalism, edtech highlights it.

“This might include a teacher using an online curriculum application to help cater to a very broad range of learning levels in a classroom, while relying on their breadth and depth of subject expertise to provide point-in-time support and monitoring of student progress,” says Stephens.

“Or using generative image software to improve engagement with school and learning, build digital literacy and super-charge English language acquisition by recent migrants and refugees.”

Equity and inclusion must remain a significant priority in the evaluation process, particularly as GenAI has the potential to increase disadvantage through cost, literacy and digital access.

“If we drop our guard on this, there is every chance that better resourced students, families, schools and systems will be better equipped to assess, explore, and benefit from existing and emerging digital tools,” says Stephens.

“This absolutely requires adequate and equitable resourcing at the school level. It also invites governments to consider how best to use other levers at their disposal, to bend the market toward equity, such as quality standards and procurement processes.”

Statewide challenges

AEU Victorian Branch vice president, secondary, Marino D’Ortenzio warns that despite the national framework for GenAI, there are different views on its use and implementation between jurisdictions in Australia. “For example, in NSW AI is permitted to be used to create newsletters, whereas in Victoria this is explicitly forbidden in the Victorian government school system policy.”

D’Ortenzio says that as GenAI and machine learning systems become ubiquitous, system-wide training will be vital to prepare staff adequately and schools must be given the means to analyse impact on teacher workload.

“We recognise that GenAI is here and that students and teachers are using it. This means our approaches to learning tasks have already begun to alter. Teachers must be at the centre of decisions relating to AI and pedagogy in schools as it expands in its scope and use,” he says.

“We know of schools that are changing the way they approach tasks to ensure that GenAI does not give students who use it an advantage. Some are returning to handwritten assessment pieces. Others are setting tasks that assume GenAI is going to be used, by getting students to identify how they might ask a GenAI model to produce a result, and then analysing the result to examine where it is flawed.

“The department of education and training must be accountable for the implementation, use and decisions of GenAI in schools. This accountability should be set out in clear, publicly available guidelines for schools and their communities.”

D’Ortenzio also says commercial businesses that see an opportunity for profit-making must come second to educational programs, pedagogical models, student development and student achievement.

Ad-hoc regulation

Use of AI technology in Queensland remains ad-hoc and regulation of platforms and guidelines for digital technology have not kept pace with change, says Queensland Teachers’ Union honorary vice president Josh Cleary.

“There is an urgent need for the profession to adopt a decision-making framework and ensure there is industrial consultation that addresses the full suite of legal, professional and educational issues,” he says.

When the Queensland Department of Education began consultation in 2020 it assumed teachers would familiarise themselves with new digital technologies outside of working hours.

“The QTU successfully negotiated an allocation of additional funds for the purpose of releasing teachers to undertake training. The rollout of the professional training was not perfect, but the approach to consultation between the parties has significantly improved,” he says.

Excessive data entry and unreasonable quantities of email are two common examples of work intensification that detract from teachers’ time to plan, implement, and evaluate effective teaching and learning practices, and the use of AI has so far added to teacher workloads rather than allowing teachers to focus on what they do best: teaching students.

“A future-focused pedagogy might use GenAI technology as a platform, but classrooms should not become subordinate to technology’s use. Teachers must be given training to help them ensure students learn to maintain a critical awareness of information and make discerning choices about the use of GenAI,” Cleary says.

This article was originally published in the Australian Educator, Summer 2024