Nearly 80% of Australian students are already using generative AI – but should they be?
In this episode, Celeste Fourie and David Karsten are joined by Alex Jenkins, Director of the WA Data Science Innovation Hub, to discuss the future of artificial intelligence in education.
From AI-powered tutoring to the risks of cognitive offloading, this episode examines how students and educators can collaborate with AI while preserving critical thinking.
Alex Jenkins, Director of the WA Data Science Innovation Hub
Alex is an artificial intelligence specialist and experienced technology leader. He has more than 15 years of experience in the technology space, where he implements and advocates for analytics, data science and artificial intelligence solutions.
This podcast is brought to you by Curtin University. Curtin is a global university known for its commitment to making positive change happen through high-impact research, strong industry partnerships and practical teaching.
If you liked this episode, why not explore our Master of Artificial Intelligence or one-year Graduate Diploma in Education?
Email thefutureof@curtin.edu.au
Hosts: Celeste Fourie and David Karsten
Content Creator and recordist: Caitlin Crowley
Producer: Emilia Jolakoska
Executive Producers: Anita Shore and Natasha Weeks
Curtin University acknowledges Aboriginal and Torres Strait Islander people, the First Peoples of this place we call Australia, and the First Nations peoples connected with our global campuses. We are committed to working in partnership with Custodians and Owners to strengthen and embed First Nations’ voices and perspectives in our decision-making, now and into the future.
Curtin University supports academic freedom of speech. The views expressed in The Future Of podcast may not reflect those of Curtin University.
00:00:00:05 – 00:00:10:01
David Karsten
This is The Future Of, where experts share their vision of the future and how their work is helping shape it for the better. I’m David Karsten.
00:00:10:03 – 00:00:11:16
Celeste Fourie
And I’m Celeste Fourie.
00:00:11:18 – 00:00:33:22
David Karsten
Every day, AI becomes a bigger part of our lives – shaping the feeds we scroll, the way we write emails, and even the music we listen to. But how does this technology fit into classrooms? Well, with nearly 80% of Australian students already using generative AI tools like ChatGPT and Microsoft Copilot, the way we learn is changing, whether we’re ready or not.
00:00:33:24 – 00:01:00:18
Celeste Fourie
In this episode, we were joined by Alex Jenkins, Director of the WA Data Science Innovation Hub and an artificial intelligence specialist. We discussed what to expect in education when AI becomes a cornerstone of the classroom, the risks of AI use – like cognitive offloading – and advice for students and educators using these tools.
If you’d like to learn more about this topic, you can visit the links in the show notes.
00:01:00:20 – 00:01:14:08
Celeste Fourie
Alex, you’ve worked in the technology space for more than 15 years. When did you first realise AI had the potential to change the way we create, share and build knowledge?
00:01:14:10 – 00:01:37:13
Alex Jenkins
Wow, what a great question. Easy one to start off with – thanks for asking.
I’ve been paying attention to what’s been happening in neural networks for a very, very long time. There was a breakthrough in something called deep learning around 2011 or 2012, when people had been experimenting with this early version of AI called neural networks.
00:01:37:19 – 00:01:48:09
Alex Jenkins
Neural networks had been around for a long time, but it was only in the early 2010s that we started to get the technology to really be able to run these things at scale.
00:01:48:09 – 00:01:52:05
David Karsten
Neural networks – you might have to refresh us on that one there.
00:01:52:11 – 00:02:15:16
Alex Jenkins
So a neural network is a type of software designed to emulate, at a very simple level, how the brain works. It’s a different way of running computer software.
There are things that humans have always been good at – like handwriting recognition and understanding the intent of language – that are very, very difficult for computers.
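[Editor's note: for readers who want a concrete picture of the "neuron" idea Alex describes, here is a toy sketch in Python. The weights are hand-picked for illustration rather than learned from data, and the network itself – computing an XOR-style pattern – is an invented example, not something discussed in the episode.]

```python
import math

def neuron(inputs, weights, bias):
    # A single artificial "neuron": a weighted sum of its inputs passed
    # through a squashing function (here, the logistic sigmoid).
    total = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1 / (1 + math.exp(-total))

# A hand-wired two-layer network computing an XOR-like pattern
# (illustrative weights, not learned from data).
def tiny_network(x1, x2):
    h1 = neuron([x1, x2], [20, 20], -10)    # fires if either input is on
    h2 = neuron([x1, x2], [-20, -20], 30)   # fires unless both inputs are on
    return neuron([h1, h2], [20, 20], -30)  # fires only if both hidden units fire

print(round(tiny_network(0, 0)))  # 0
print(round(tiny_network(1, 0)))  # 1
print(round(tiny_network(1, 1)))  # 0
```

Real neural networks chain thousands or millions of these units and learn the weights automatically from data – which is what the deep-learning breakthrough of the early 2010s made practical at scale.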
00:02:15:18 – 00:02:44:16
Alex Jenkins
Computers have been terrible at that for about 70 years. We’ve been trying to get them to do it for decades, and it’s mostly been a failure.
But neural networks have been around for a long time, and we eventually started to get the computing power to grow them and grow them and grow them. Over the next ten years, these systems became more and more capable of doing human-like tasks – things like image recognition and transcribing audio into text.
00:02:44:18 – 00:03:24:00
Alex Jenkins
Then finally, around 2022 or 2023, we saw what we call GPTs – generative pre-trained transformers. These are large electronic brains that hoover up information from across the web. They started to show characteristics of being able to understand language, answer questions and even learn.
When ChatGPT was released, I knew at that moment – my goodness – this was the point where AI moved from a weird, niche research area into something that was going to explode into the mainstream.
00:03:24:02 – 00:03:28:00
Alex Jenkins
It’s an incredible time for this technology. And the question now is: what do we do with it?
00:03:28:02 – 00:03:55:16
David Karsten
Even back in 2018, when your organisation kicked off, the economic impact in WA alone of disruptive digital and internet technologies was forecast to be around $25 billion by this year.
How far off was that estimate, do you reckon, now that generative AI has really taken hold over the last couple of years?
00:03:55:20 – 00:04:22:06
Alex Jenkins
Yeah – it changes the estimate by an order of magnitude.
Now we’re talking trillions of dollars globally, and certainly many hundreds of billions of dollars for Australia. It’s going to change industry in every sector, but in different ways.
I think the economic value might be slower to realise than a lot of people think, because it’s not easy to integrate this technology into organisations.
00:04:22:06 – 00:04:33:06
Alex Jenkins
Even if development stopped right now, it would still take years to properly integrate this level of intelligence into our current systems and ways of working.
00:04:33:08 – 00:04:53:11
David Karsten
One area where adoption has been really rapid is education. The take-up has been remarkable.
How is the sector being disrupted by AI, in your opinion? And where do you see it taking classrooms by, say, 2035?
00:04:53:13 – 00:05:21:01
Alex Jenkins
This is my big thing – and the area I’m most excited about.
I gave a TED Talk in 2023 called An AI Tutor for Every Child. I’m a big fan of Sal Khan and the Khan Academy, which is the world’s largest educational nonprofit. Throughout the 2010s and into the 2020s, Khan Academy created online class modules for students to work through.
00:05:21:03 – 00:05:51:24
Alex Jenkins
They made it all free, for anyone to use anywhere in the world. It’s a very democratic way of sharing education.
Sal Khan’s big idea was a concept called mastery learning. If you think about how a normal classroom works – say we’re studying maths – we go into a classroom, we learn about fractions for three weeks, and then we move on to number properties or something else.
00:05:52:03 – 00:06:14:16
Alex Jenkins
The whole class starts fractions at the same time and finishes fractions at the same time, and then everyone moves on. But the reality is that students don’t learn at the same rate. Everyone learns in their own time, especially in areas like mathematics.
What we see is that students don’t gain complete mastery over topics, and over time they lose confidence in learning new things.
00:06:14:16 – 00:06:24:02
Alex Jenkins
It’s like building a house on dodgy foundations – it really diminishes what you can build on top. A lot of students end up saying, “I can’t do maths,” and they internalise that.
00:06:24:03 – 00:06:36:13
David Karsten
This is very personal to me, Alex. Are you talking about me?
My primary and high school experience – it’s quite triggering, actually. I think I need a moment.
00:06:36:15 – 00:07:02:07
Alex Jenkins
I’ll give you that moment.
It’s almost universal, and it’s a failure of our education systems over a long time that we haven’t had the capability to let students learn at their own pace. The idea of a single teacher in a classroom of about 30 kids dates back to the industrial revolution.
00:07:02:07 – 00:07:22:21
Alex Jenkins
It’s a very old model, and we haven’t updated it often enough. We know that when students are tutored one-on-one – I tutored high school students in maths when I was at university – their performance improves significantly.
00:07:22:21 – 00:07:45:04
Alex Jenkins
You can turn a good student into an excellent student, and a struggling student into a good one.
Before AI, we never had the ability to say, “Let’s give every child a tutor.” As a society, we simply didn’t have the capacity to value or resource education in that way.
00:07:45:06 – 00:08:10:10
Alex Jenkins
However, when AI came about, it quickly became apparent that it was capable of explaining things – and explaining them again – without ever losing patience, and working through problems with you.
All of a sudden, it’s like, oh my gosh, this is a technology we can deploy at scale that can understand each student’s strengths and weaknesses, and how they like to learn.
00:08:10:12 – 00:08:33:24
Alex Jenkins
We can use this to tutor students and make sure they understand a concept before moving on to the next topic. And we know that when students – particularly in primary and secondary school – are allowed to truly master what they’re learning, they do so much better.
So my hope is that we can deploy this at scale.
00:08:33:24 – 00:08:45:22
Alex Jenkins
By 2035, the vision is teaching supported by a computer or an iPad that essentially acts as a personal tutor, guiding each student through their educational journey.
00:08:45:24 – 00:09:16:23
Celeste Fourie
I wish I’d had that when I was at school. When I was doing my maths homework and preparing for exams, I had a tutor because I couldn’t always keep up with where my classmates were at.
If I’d had this back then, I think it would have been amazing. Talking specifically about an AI tool – would I have been able to take a photo of a maths equation and put that into an AI model?
00:09:16:23 – 00:09:18:16
Celeste Fourie
And it would be able to solve it for me?
00:09:18:16 – 00:09:21:05
Alex Jenkins
Yeah – the models today are good enough to do that.
00:09:21:05 – 00:09:22:06
Celeste Fourie
Can analyse images?
00:09:22:06 – 00:09:43:08
Alex Jenkins
Yep. Absolutely. They can look at an image.
One of the best examples – and I’ll share a link with your audience – is where students share their screen on an iPad and work through a problem. There’s a great demo from OpenAI where Sal Khan’s son works through a trigonometry problem with an AI.
00:09:43:10 – 00:09:52:02
Alex Jenkins
That kind of interactive, handwritten experience is much better than just typing answers back and forth.
00:09:52:04 – 00:10:00:03
Celeste Fourie
Do you think AI will eventually change what we teach – not just how we teach it?
00:10:00:05 – 00:10:26:05
Alex Jenkins
Yeah, I do – and for an interesting reason.
If you zoom out and look at education over the last few hundred years, not just the last few years, our expectations of what people should learn have changed dramatically. It used to be that only the very wealthy and the elite had tutors or anything resembling modern schooling.
00:10:26:07 – 00:10:47:15
Alex Jenkins
For a long time, most people were considered incapable of reading – we genuinely believed people were too dumb to read.
Even today, there are parts of the world where women are still not educated to the same level as men. What we expect a school leaver to understand has changed significantly over time.
00:10:47:17 – 00:11:08:20
Alex Jenkins
Historically, education standards have always been shaped by context. So this idea that we leave school with a basic understanding of algebra and can write an essay – these are somewhat arbitrary benchmarks.
It’s going to be really interesting to see how artificial intelligence allows us to push those standards further.
00:11:08:22 – 00:11:24:01
Alex Jenkins
There are already parts of the world where students leave high school with a much deeper understanding of mathematics than they do in Australia. It’ll be fascinating to see what this technology can do in terms of lifting educational outcomes.
00:11:24:03 – 00:11:48:10
David Karsten
Let’s talk about leaving school and moving into tertiary education. You’re based here at Curtin, so this is very current.
We’re already seeing the effects of generative AI on policy here – questions around where it can be used in academic research and student assignments.
00:11:48:12 – 00:12:00:17
David Karsten
How do you see this taking shape? What’s the status at the moment, and where is it heading?
Let’s start at the student level, then move to academics.
00:12:00:19 – 00:12:18:10
Alex Jenkins
With students, the biggest challenge initially is assessment.
I used to be able to give my students an essay on the history of World War II, give them two weeks to complete it, and then mark that essay. Now we live in a world where an AI can generate a high–quality essay in 30 seconds.
00:12:18:10 – 00:12:39:04
Alex Jenkins
Universities around the world – Curtin included – began using tools like Turnitin to try to detect AI-generated content. But that quickly became an arms race. These tools are never 100% reliable.
00:12:39:04 – 00:13:07:15
Alex Jenkins
They’re trained on human writing to emulate human writing. You end up with students accused of cheating when they haven’t, and students who have cheated slipping through the net.
It’s often enough to tweak a few words for an AI detector’s confidence score to change completely. Assessment has become a real challenge, both in schools and in the tertiary sector, since the introduction of tools like ChatGPT.
00:13:07:17 – 00:13:31:03
Alex Jenkins
It requires a proper rethinking of how we assess learning. Curtin is running a program called Assessment 2030 – Molly does a great job with that – which asks: what do we want students to leave university with, and how do we assess those skills meaningfully?
00:13:31:05 – 00:13:42:21
Alex Jenkins
We’re already seeing things like the return of oral examinations. Ultimately, assessment methods need to change, because the technology isn’t going away – and students will be using it in the workforce.
00:13:42:21 – 00:14:12:11
David Karsten
It’s such a loaded question, isn’t it? AI is widely used in the workplace, but in tertiary education it almost comes down to a personal choice at the moment – how much value you want to get out of your education as a student.
00:14:12:11 – 00:14:19:15
David Karsten
As the guardrails and policies are still being developed, it really becomes a question of: how much do I want to get out of this degree? Is this work coming from me, or from other sources?
00:14:19:15 – 00:14:41:17
Alex Jenkins
There is definitely an onus on students to take responsibility and ask themselves whether they’re going to engage properly, bend the rules, or outright cheat.
But we also need to look past that framing, because at the moment it turns AI use into a choice rather than a learning tool.
00:14:41:19 – 00:14:52:00
Alex Jenkins
What we should be doing is teaching students how to use AI to support their learning, rather than using it for what’s called cognitive offloading –
00:14:52:02 – 00:14:52:09
David Karsten
Yeah.
00:14:52:14 – 00:15:07:21
Alex Jenkins
–which is essentially getting the AI to do the thinking for you.
My spelling has gotten worse since Google and spellcheck. My navigation skills have declined since Google Maps. We don’t want critical thinking to be offloaded to artificial intelligence.
00:15:07:21 – 00:15:22:03
Alex Jenkins
We need to structure courses so students leave with strong skills – where learning happens with AI, not instead of it.
00:15:22:08 – 00:15:28:15
David Karsten
So Alex, when you say 2030 – is that when we’ll see a workable policy?
00:15:28:15 – 00:15:50:10
Alex Jenkins
Oh no – that’s just the name of the program. Assessment 2030 is a university-wide review of assessment.
What I’d love to see are exams that look very different. Imagine coming into an exam, sitting down at a university-provided computer, and being examined with AI as part of the process.
00:15:50:12 – 00:16:17:07
Alex Jenkins
I was just reading a paper this morning where people were testing whether AI could grade papers – law papers, student papers – at the same level as a law professor. And in certain cases, it can. So the technology is moving in that direction, and we should make use of it.
If you have an AI set up that’s designed to test your knowledge and your understanding, that’s a new form of assessment that I don’t think anyone’s really experimented with yet.
00:16:17:13 – 00:16:27:08
Celeste Fourie
Talking about cognitive offloading, what are some of the consequences that come with this overreliance on AI?
00:16:27:10 – 00:16:47:21
Alex Jenkins
Oh yeah. I mean, I get emails sometimes from people, and they send me a report they’ve created – and it’s clearly just straight out of ChatGPT. You know, copy and paste. It’ll be 17 pages long. And I look at it and think: what’s the point of this? What are you trying to communicate?
00:16:47:23 – 00:17:12:21
Alex Jenkins
And I don’t think AI will replace those important interpersonal skills – the communication skills. The ability to formulate an argument and to critically think about things… that’s what we’re at university to do. To lose that – I mean, I’m not sure if you’ve seen the film Idiocracy. It’s a bit old now, but… yeah. That’s the worst-case scenario.
00:17:12:23 – 00:17:25:21
Alex Jenkins
I don’t think people will rely on AI for everything, but it will be very tempting to hand off that critical thinking ability. And that’s not a world we want to live in.
00:17:25:23 – 00:17:45:07
David Karsten
Idiocracy – that is a frightening glimpse into what’s possible.
On the other side of all of this: the tech companies creating these generative AI programs – their interests have got to be purely financial, yeah?
00:17:45:07 – 00:17:46:06
Alex Jenkins
Right. Yeah.
00:17:46:07 – 00:17:58:04
David Karsten
Is there ever going to be such a thing as ethical AI that prioritises students and educators, rather than just engagement and mining data?
00:17:58:06 – 00:18:20:17
Alex Jenkins
Yeah. That’s a big problem. The incentives of the companies providing this technology aren’t necessarily aligned with the incentives of a university, or the interests of a student.
There are some software-specific things you can do. For example, if you’re using ChatGPT, there’s a study-and-learn mode. And I think Google Gemini has something similar.
00:18:20:19 – 00:18:37:16
Alex Jenkins
And even if it doesn’t have those modes, you can – again, you have to have the discipline as a student – go in and say: “Look, I’m here to learn this. If I ask for help, don’t give me the answer. Just point me in the right direction.” You can do those things.
00:18:37:16 – 00:19:00:20
Alex Jenkins
But at the fundamental level, there’s incentive misalignment from the big corporate players.
One thing we can do is develop education–specific AIs – we can have Australia–specific artificial intelligences. I think the future… the best possible future–
00:19:01:00 – 00:19:20:06
Alex Jenkins
Let me start with the worst possible future. The worst possible future is that one or two mega-companies based in the United States end up essentially defining the ethics of what AI should do – because if they have the best model and everyone wants to use it, they decide what’s acceptable and how it should behave.
00:19:20:08 – 00:19:24:05
David Karsten
It sounds kind of familiar. I mean, that’s the point – is that where we’re going?
00:19:24:05 – 00:19:41:06
Alex Jenkins
I mean, that’s sort of where we’re at, right?
OpenAI, for instance – I think they do a reasonable job of defining what an AI should and shouldn’t say, but sometimes they get it wrong. And also, the idea that they should be making that decision isn’t a good one.
00:19:41:08 – 00:20:13:07
Alex Jenkins
I’m a big proponent of what we call open-source AI – models that are released for free for people to modify and change.
Worst case: one or two megacorps. Best case: we have dozens of different AI models we can choose from, tailor to our own standards, and where the ethical standards come from our community and our government – essentially allowing Australians to decide what goes into AI here.
00:20:13:09 – 00:20:18:14
Alex Jenkins
So yeah – it’s a future that could go either way.
00:20:18:16 – 00:20:37:23
David Karsten
At a tertiary education level, is the sector looking at option B as a real possibility – not retrofitting, but taking an open-source model and shaping it in a way that suits tertiary education, and the ethics the sector wants to uphold?
00:20:37:23 – 00:20:55:14
Alex Jenkins
They’ve done things like that in New South Wales. They’ve got a chatbot there that’s available to both students and teachers, which – I believe – doesn’t actually use open-source AI in the background. I think it uses proprietary AI.
However, they’ve tuned it to make sure it’s specific to their curriculum and standards, and what it’s acceptable for a student to chat about.
00:20:55:14 – 00:21:24:07
Alex Jenkins
New South Wales is really ahead of the game. They’re building something – Classmate – which is sort of an AI for teachers at the moment, to generate lesson plans and save teachers time.
A large part of that work is ensuring it aligns with curriculum and the ethical standards we want for students.
00:21:24:07 – 00:21:31:05
Alex Jenkins
So there’s a bit of work there to be done. And I think we could move faster in Australia on that stuff, for sure.
00:21:31:08 – 00:21:38:20
Celeste Fourie
With these tech companies and new AI tools emerging, who do you think is most vulnerable?
00:21:38:22 – 00:22:19:02
Alex Jenkins
Again, it really comes down to what our future with AI looks like. Let’s stick with this 2030 horizon.
By then, I think our phones – or maybe our watches – will effectively be personal assistants. The utopian version is that they monitor what we do during the day, book restaurants, book doctor’s appointments, sit down with us when we need to work, understand exactly how we like to learn, and genuinely make our lives easier. That would be fantastic – and we’re not very far away from something like that.
00:22:19:02 – 00:22:48:08
Alex Jenkins
The worst-case scenario is an amplification of the problems we already see with social media. Things like echo chambers, where people are pushed into more extreme viewpoints, and their entire feed becomes populated with content that reinforces those views. That becomes their whole world – and it’s incredibly divisive for society.
00:22:48:10 – 00:23:06:02
Alex Jenkins
We see this particularly with young men, who can get drawn into these spaces. An AI-driven version of that would be horrific. If we’re all consuming AI-generated content, these systems start to resemble platforms like TikTok – they’re almost like poker machines or gambling machines.
00:23:06:02 – 00:23:32:20
Alex Jenkins
You get these little dopamine hits, the content is short, and the algorithms are incredibly good at figuring out what you like. They personalise your feed and keep you engaged. Soon, a lot of that content will be AI–generated as well.
We could end up in this sea of constant consumption, where we’re all glued to our screens purely to stay engaged.
00:23:32:20 – 00:23:35:18
Alex Jenkins
That would be a horrible future.
00:23:35:20 – 00:23:43:22
David Karsten
Alex, you’ve spoken about algorithmic bias as an equity risk – but what about access gaps? Is that also a potential vulnerability?
00:23:43:24 – 00:24:07:01
Alex Jenkins
That’s a really important question. This technology is incredibly powerful, and its cost is dropping – it’s becoming more of a commodity.
But there’s still the fundamental cost of access: devices like phones, tablets, and – critically – reliable internet connections.
00:24:07:01 – 00:24:20:20
Alex Jenkins
As the technology progresses, it’ll become more multimodal. You’ll talk to it, it’ll talk back. It’ll stream video to you, but also see what’s on your screen and what you’re looking at. There are already sunglasses with AI cameras built into them.
00:24:20:22 – 00:24:21:14
David Karsten
So weird.
00:24:21:18 – 00:24:31:05
Alex Jenkins
Do you remember Google Glass? People used to have those camera-enabled glasses ripped off their faces in public.
00:24:31:05 – 00:24:34:02
David Karsten
Oh really? Because they were quite identifiable as well.
00:24:34:02 – 00:24:51:05
Alex Jenkins
People didn’t want to be filmed. That was about 15 years ago.
When Google tried this with Google Glass, the technology wasn’t as advanced, but it was very clearly rejected by society.
00:24:51:05 – 00:25:10:15
Alex Jenkins
I don’t think that rejection would happen now.
So to answer your question more directly: even though the underlying AI is becoming cheap, the hardware around it is still a major issue. We don’t want some kids using AI to support their learning while others don’t have access.
00:25:10:17 – 00:25:31:03
Alex Jenkins
We already see this in education. Independent schools are well ahead in their use of AI. Government schools often need to move more cautiously and in unison.
There’s also a significant disparity between metropolitan Perth and regional areas.
00:25:31:05 – 00:25:33:14
David Karsten
And that’s unfortunately always been the case.
00:25:33:16 – 00:25:39:04
Alex Jenkins
It has – and AI risks reinforcing those same inequalities.
00:25:39:06 – 00:25:50:04
Celeste Fourie
How are Australian schools and universities responding to AI? And how do you see policies and rules evolving over the next decade?
00:25:50:06 – 00:26:07:06
Alex Jenkins
If we go back to when tools like ChatGPT first appeared, the initial response was panic – oh my goodness, what on earth are we going to do about this?
In schools especially, the first reaction was often to ban it. But that’s not a long-term solution.
00:26:07:08 – 00:26:27:00
Alex Jenkins
Then you see a shift towards trying to detect AI-generated work – and again, that’s a difficult and flawed approach.
So the next step becomes rethinking how we assess students’ understanding.
00:26:27:02 – 00:26:54:06
Alex Jenkins
That takes time. You’re talking about overhauling assessment processes that might not have changed in 30 years – and that’s a big deal.
It’s the same challenge in schools and universities. There’s this trajectory from denial, to resistance, to acceptance, and eventually to meaningful integration.
00:26:54:08 – 00:26:55:07
Alex Jenkins
It’ll take a few years to get there.
00:26:55:09 – 00:27:20:11
David Karsten
We’re at a pretty critical point – right at the coalface of a lot of change.
Alex, you’re right there as well. Alongside being Director of the Data Science Innovation Hub, you sit on the Department of Health’s AI Oversight Committee, the Department of Education’s Generative AI Advisory Group, and the Department of the Premier and Cabinet’s AI Advisory Board.
00:27:20:13 – 00:27:48:17
David Karsten
You’ve got this incredible overview of how AI is being integrated across society. So at this point, what would your advice be to high school and university students on using AI responsibly – but also effectively – right now?
00:27:48:19 – 00:28:10:24
Alex Jenkins
My advice is to learn how to use AI in a way that supports your thinking, rather than replaces it.
If you’re working on a problem, say to the AI: I need to understand how to do this. Don’t give me the answer – guide me towards the correct method.
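[Editor's note: the "guide me, don't give me the answer" instruction Alex describes can be written down as a reusable system prompt. Below is a minimal sketch in Python, assuming a chat-style API that takes role-tagged messages; the exact API shape and field names vary by provider, so treat this as illustrative rather than a specific product's interface.]

```python
# The tutor-style instruction, captured once so it can be reused
# at the start of every study session.
TUTOR_PROMPT = (
    "You are my maths tutor. I am here to learn, so never give me the "
    "final answer. When I ask for help, point me towards the correct "
    "method, ask guiding questions, and check my reasoning step by step."
)

def build_messages(student_question):
    # Most chat-style AI APIs accept a list of role-tagged messages
    # like this: a "system" message setting the behaviour, followed
    # by the student's "user" message.
    return [
        {"role": "system", "content": TUTOR_PROMPT},
        {"role": "user", "content": student_question},
    ]

messages = build_messages("How do I solve 3x + 5 = 20?")
print(messages[0]["role"])  # system
```

The design point is simply that the behavioural boundary lives in the prompt, not in the question – so the same tutoring stance applies no matter what the student asks next.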
00:28:11:01 – 00:28:29:00
Alex Jenkins
Treat AI more like a tutor than an answer machine.
That relationship matters, and it’s as simple as telling the AI how you want it to behave. It will respond differently – and respect those boundaries.
00:28:29:02 – 00:28:49:02
Alex Jenkins
I’d also strongly advise students that when AI says something, ask it for a source.
These systems still hallucinate – they still make things up, just like people do. And particularly in an academic setting, that’s not acceptable. In science, we don’t accept “well, I reckon one plus one equals three.”
00:28:49:04 – 00:29:08:10
Alex Jenkins
You need to back things up with evidence. Always ask the AI for links, for sources, and for where its claims are coming from – then dig into the primary source yourself.
That’s an important skill in education and learning anyway, and it helps ensure you’re not misled by AI.
00:29:08:10 – 00:29:11:11
Alex Jenkins
So those are my two top tips.
00:29:11:13 – 00:29:19:21
Celeste Fourie
And for educators, how can they build confidence with AI and use it to strengthen their practice?
00:29:19:23 – 00:29:40:14
Alex Jenkins
I think it really comes down to experimenting with AI – seeing what it does well and what it doesn’t. Don’t be afraid of it.
And again, treat AI output as if it’s coming from an untrusted individual. If it makes a claim you’re unsure about, ask where it got that information from.
00:29:40:14 – 00:30:05:07
Alex Jenkins
Educators can really benefit from AI in practical ways. Say you have a lesson plan, but you also have a student with additional needs – maybe someone on the autism spectrum – and they love trains.
You can take that lesson plan and say: let’s make this about trains. Or soccer players. Or whatever their interests are.
00:30:05:09 – 00:30:35:14
Alex Jenkins
The Autism Academy recently won a large grant from the Department of Communities, and I was a partner in that project. We used it to build generative AI activities for neurodiverse students.
We also held a State of AI forum here at Curtin, run by the Fogarty Foundation and Beyond Boundaries Foundation, where we delivered a half-day session showcasing some of these activities.
00:30:35:14 – 00:30:53:14
Alex Jenkins
It was great to see teachers picking this up and learning how to really run with it. There are all kinds of classroom activities you can do – like asking the same question to AI models from different regions, say one from the United States and one from China, and comparing how they respond.
00:30:53:14 – 00:30:54:18
David Karsten
How interesting.
00:30:54:20 – 00:31:02:12
Alex Jenkins
And then asking why they respond differently. These systems are a product of society – they reflect our writing, communication styles and ethical standards.
00:31:02:12 – 00:31:03:24
David Karsten
They bring their own context with them.
00:31:03:24 – 00:31:25:12
Alex Jenkins
Exactly. Just like 20 years ago, when students analysed newspaper columns in high school and looked for bias – what information was included, what was left out – we need to do the same with AI.
AI isn’t magic. It’s a product built within a regional, political and cultural context.
00:31:25:12 – 00:31:41:02
Alex Jenkins
It reflects the standards and values of the communities it comes from, and understanding that is incredibly important.
00:31:41:02 – 00:32:05:04
David Karsten
You mentioned earlier that having major ethical and policy decisions in the hands of tech firms is a less-than-ideal situation.
So who should be responsible for AI ethics, ideally?
00:32:05:06 – 00:32:07:04
Alex Jenkins
That’s such a good question.
00:32:07:04 – 00:32:08:20
David Karsten
Are we too late?
00:32:08:22 – 00:32:27:22
Alex Jenkins
No – we’re not too late.
I think we need to approach AI ethics the same way we approach ethics in journalism. I get asked all the time: how do we remove bias from AI? And my response is: how do we remove bias from a newspaper?
00:32:27:24 – 00:32:57:14
Alex Jenkins
There isn’t a single newspaper in the world that everyone agrees is unbiased. Everyone brings their own lens and perspective.
The world is never going to agree on one universal definition of what is true or right.
00:32:57:16 – 00:33:17:11
Alex Jenkins
Newspapers are actually a great analogy for AI. They’re collections of content, shaped by editorial decisions – not just about how things are written, but what information is included, and what’s left out.
00:33:17:13 – 00:33:44:15
Alex Jenkins
We don’t try to create one perfect newspaper that tells the absolute truth. Instead, we have a diversity of outlets across the political spectrum – left, right, progressive, conservative.
I think we need to do the same with artificial intelligence.
00:33:44:16 – 00:34:13:24
Alex Jenkins
Rather than one single AI system, we should have many – including open-source models that organisations can adapt to their own ethical frameworks, whether that’s education, government or industry.
That diversity is the best outcome we can hope for, because the world will never agree on a single definition of truth – and AI is no different.
00:34:14:01 – 00:34:14:18
Alex Jenkins
Yeah.
00:34:14:20 – 00:34:23:01
David Karsten
Do you see Australia maybe charting its own course in terms of AI policy in the future, or is that a global thing?
00:34:23:01 – 00:34:46:12
Alex Jenkins
So the government has just released a national AI plan where they’ve said they’re not going to move forward with stringent guardrails. I think that’s probably the right approach at this point in time.
I think it’s better to acknowledge that there will be risks with AI, but that many of those risks can be managed through existing legal frameworks.
00:34:46:14 – 00:35:13:17
Alex Jenkins
The risks are also going to be different depending on the sector – whether that’s automotive, pharmaceuticals or education.
So I think we’re best off leaving those specific risks and issues to the relevant sectors as they arise. A lot of what goes wrong with AI can already be handled using existing laws.
00:35:13:17 – 00:35:34:21
Alex Jenkins
My fear is that if we had a blanket AI policy, it would go out of date within three weeks of being published.
So I’m glad the federal government has taken this approach. What the future holds, I’m not sure – as AI models become more capable, that might change.
00:35:34:21 – 00:35:54:23
Alex Jenkins
There may be something there around ensuring equitable access. It’s feasible that one day we could see AI become as fundamental as access to water, power or telecommunications.
Not many people could survive in the modern world without access to the internet or a smartphone.
00:35:55:00 – 00:35:59:12
Alex Jenkins
So I can see AI becoming part of that progression.
00:35:59:14 – 00:36:15:24
David Karsten
Alex, this has been fascinating.
For you – having been so close to this for more than a decade, watching it unfold so rapidly and at such scale – how exciting has your career been over the last little while?
00:36:16:00 – 00:36:46:18
Alex Jenkins
It’s been a whirlwind.
The most exciting thing for me is being able to work across different industries. I really get a buzz from talking to people who aren’t necessarily technical. You see it a lot in health and medicine – people describe a problem, and it turns out it can be approached using artificial intelligence.
00:36:46:23 – 00:37:06:09
Alex Jenkins
One of the most famous and arguably most useful AI systems in the world is AlphaFold, which solves a fundamental biomedical research problem – protein folding.
The inventors of that were awarded a Nobel Prize. Seeing how this technology can be applied across industries is incredibly exciting for me.
00:37:06:09 – 00:37:10:00
Alex Jenkins
That’s my job – educating different industries. One day I’m talking to a bank, the next to a hospital, then to schools. It’s very fulfilling. I’m very lucky.
00:37:10:02 – 00:37:16:24
David Karsten
And in doing so, there are now many more people than just Celeste and I who can confirm you are a real person.
Thank you very much.
00:37:16:24 – 00:37:20:12
Alex Jenkins
So I don’t have an AI version of myself? Oh, yeah.
00:37:20:14 – 00:37:25:19
David Karsten
Yeah, yeah.
But we’ve really appreciated your time. Thanks so much for talking to us.
00:37:25:19 – 00:37:28:03
Alex Jenkins
Thank you – thanks very much.
00:37:28:05 – 00:37:43:14
David Karsten
You’ve been listening to The Future Of, a podcast recorded on Whadjuk Noongar Country and powered by Curtin University. If you enjoyed this episode, please share it. And if you want to hear from more experts, stay up to date by following us on your favourite podcast app. Bye for now.