The Future Of

Personal Data Privacy Online | Dr Adam Andreotta

Episode Summary

Data is the currency of the digital age, and most of us give ours away without realising. Discover how big data is reshaping our privacy online and what this means for our ultra-connected future.

Episode Notes

Data is the currency of the digital age, and most of us give our personal data away without realising. In this episode, host David Karsten is joined by Dr Adam Andreotta to discuss his book, Rethinking Informed Consent in the Big Data Age.

Listen to discover how big data is reshaping our privacy and how we can protect our information in the new age of integrated technology.

Learn more

Connect with our guests

Dr Adam Andreotta, Lecturer, School of Management and Marketing, Curtin University

Dr Adam Andreotta's research explores the philosophy of artificial intelligence, the ethics of big data, and how we can better secure informed consent surrounding the collection and use of our personal data online.

In 2024, Adam authored Rethinking Informed Consent in the Big Data Age, in which he delves into the challenges of self-managing our private data online and suggests ways we can improve data consent practices in everyday contexts.

Join Curtin University

This podcast is brought to you by Curtin University. Curtin is a global university known for its commitment to making positive change happen through high-impact research, strong industry partnerships and practical teaching.

Transcript

Read the transcript

Behind the scenes

Host: David Karsten

Recordist and Content Creator: Caitlin Crowley

Social Media: Celeste Fourie

Producer: Emilia Jolakoska

Executive Producer: Anita Shore

First Nations Acknowledgement

Curtin University acknowledges all First Nations of this place we call Australia and the First Nations peoples connected with our global campuses. We are committed to working in partnership with all Custodians and Owners to strengthen and embed First Nations’ voices and perspectives in our decision-making, now and into the future.

Episode Transcription

00:00:00:03 - 00:00:09:04

Sarah Taillier

This is The Future Of, where experts share their vision of the future, and how their work is helping shape it for the better.

 

00:00:09:06 - 00:00:35:09

David Karsten

I'm David Karsten. Data is the currency of the digital age, and most of us give it away without a second thought. Each time we click 'I agree' to privacy policies online, those clicks have consequences. They shape how our personal information is collected, shared, and sometimes even exploited. Today I was joined by Dr Adam Andreotta from Curtin University to talk about his book, Rethinking Informed Consent in the Big Data Age.

 

00:00:35:11 - 00:00:58:23

David Karsten

We explored how big data is reshaping our privacy, investigated the failures of online consent today, and looked ahead to how we might build more effective systems for the future. If you'd like to find out more about this work, you can visit the links provided in the show notes. You know, I was thinking as I prepared for this, speaking of personal data, I'm going to have a look at a bit of yours, Adam.

 

00:00:58:23 - 00:01:17:16

David Karsten

Okay. So, yeah, I just had a look at your path to where you are today, well, as much as is revealed online. And I find it very interesting that you completed an undergrad in computer science, which sort of took care of the 'what am I'. Yeah.

 

00:01:17:17 - 00:01:27:03

David Karsten

But then, did you get to a crossroads and decide to look at the ethics of it? Could you explain a little bit about your path?

 

00:01:27:03 - 00:01:42:12

Dr Adam Andreotta

Yeah. So my first degree was in computer science, and as I was doing programming I got really interested in how this relates to people. So for instance, when you're making a program, you can get it to do exactly what you want via these if-then statements, you know, if this happens, if you get this input, do this kind of thing.

 

00:01:42:14 - 00:01:57:07

Dr Adam Andreotta

And it seems to me that this is very similar to human psychology. You know, often we react in certain ways to certain inputs; think about how people are aggressive when people are aggressive to them. And I got to thinking about how this relates to, say, free will and determinism. I thought, hey, are we really free?

 

00:01:57:07 - 00:02:08:15

Dr Adam Andreotta

Maybe we're just really advanced, sophisticated programs. And I was told, that's a little bit too philosophical for the computer science department, why don't you go down the road there to the philosophy department and ask those sorts of questions.

 

00:02:08:17 - 00:02:18:14

David Karsten

So you went on to study your PhD. What was your thesis, if you could summarise it in a very simple way for us?

 

00:02:18:16 - 00:02:35:19

Dr Adam Andreotta

Well, the thesis, actually, you know, before getting into the whole AI ethics of it, I wanted to really go back and think about how we know things. So it was really on self-knowledge and how we know our own minds. And from that starting point, I wanted to get at some of the other big questions, about consent and everything like that.

 

00:02:35:19 - 00:02:54:00

Dr Adam Andreotta

But I was almost starting from, you know, where Descartes starts from, the cogito ergo sum, your own thoughts. I suppose, for me, that was the foundational kind of work that I wanted to do. And from that, I felt like what I really needed to have a bit of a think about was: how do we know our own beliefs?

 

00:02:54:00 - 00:03:10:02

Dr Adam Andreotta

How do we know our own minds? What is special about human minds? And that seemed to be a question you had to ask in order to do ethics, in order to do the philosophy of mind work. All these other kinds of fields always seem to go back to self-knowledge, so, you know, how we know things in general.

 

00:03:10:08 - 00:03:26:06

Dr Adam Andreotta

What is more familiar to us than our own minds? So it was all about the epistemology, how we know things. And yeah, that was the next step after computer science. And I guess after that, I tried to bring those two together, the technology and the philosophy, and that's kind of what I do now.

 

00:03:26:08 - 00:03:35:15

David Karsten

Well, the question of whether or not we are actually just programs sounds like the plotline of The Matrix. Are you Neo?

 

00:03:35:18 - 00:03:49:18

Dr Adam Andreotta

Yeah. Well, there are lots of interesting science fiction films about that kind of thing, and that was what I was really interested in. I suppose around that time The Matrix had just come out, this was in the mid-2000s, so a little bit after that. But I was really interested in those questions. And you know what?

 

00:03:49:18 - 00:04:04:10

Dr Adam Andreotta

If we're just programs, what does that say about our responsibility? You know, maybe none of us can ever be praised or blamed for the actions we take, that kind of thing. And, you know, people in computer science said, look, they're really interesting questions, but just stick to the programming. You know, this is where the syntax goes in the language.

 

00:04:04:10 - 00:04:21:18

Dr Adam Andreotta

And these are sort of not very practical questions. What you should do is figure out how the programming works and then get hired at a company. But I always had those really interesting questions in the back of my mind that I kept getting distracted by, and eventually those sort of took over.

 

00:04:21:18 - 00:04:36:10

Dr Adam Andreotta

And, you know, I find that if you're interested in things, you spend more time wanting to learn them, and eventually the C++ and all the programming became a little bit dry and didn't excite me in the same way, once I saw that there were people working on these questions.

 

00:04:36:10 - 00:04:40:23

Dr Adam Andreotta

So it was around that time that I kind of switched a little bit, I feel like.

 

00:04:41:00 - 00:04:58:08

David Karsten

Well, look, I think it's a very worthy distraction, and one where there are lifetimes of work ahead of you, really, in this area. But before we go into a deeper exploration of your actual work as it stands today, can you just define for us what personal data is? I mean, you talk about informed consent.

 

00:04:58:09 - 00:05:01:11

David Karsten

Yeah, and that's to do with personal data. Can you define what that is for us?

 

00:05:01:12 - 00:05:13:11

Dr Adam Andreotta

Yeah. So personal data is really anything that relates to an identifiable or an identified natural person. This is the definition that's taken up by the GDPR.

 

00:05:13:11 - 00:05:14:13

David Karsten

The GDPR being the...?

 

00:05:14:13 - 00:05:33:04

Dr Adam Andreotta

The General Data Protection Regulation. This is a really impressive piece of legislation from the last few years from Europe, from the EU countries, and it's really set the pace in terms of data protection. It's, you know, the leading kind of legislation, the envy of Australia and, I suppose, the United States.

 

00:05:33:06 - 00:05:56:10

Dr Adam Andreotta

When we're talking about personal data, we're talking about anything that can potentially identify you. So your IP address, your emails, photos of you, likes, browsing data, purchase history, and the list goes on and on. It's really anything that can be tied back to you and reveal something about you, and you can imagine the kinds of things that come from being on the internet or using a smartphone.

 

00:05:56:12 - 00:06:00:10

Dr Adam Andreotta

Anything that kind of contributes to your digital footprint would be part of that.

 

00:06:00:12 - 00:06:15:03

David Karsten

And let's face it, our footprint in 2025 is massive as individuals, isn't it? There's just no getting away from it. Why is that a problem? Why is giving up our data a problem?

 

00:06:15:05 - 00:06:34:20

Dr Adam Andreotta

Well, it need not necessarily be a problem, but I think many people are becoming increasingly alarmed because of the importance of privacy. So we want to take a step back and ask, why is privacy important? If we think about it, privacy limits others from having power over us. When we give away our privacy, we open ourselves up to manipulation.

 

00:06:34:22 - 00:07:04:08

Dr Adam Andreotta

We can maybe be harmed. Privacy is important for trust, for our own autonomy; we don't want to give away too much information. So think about why privacy is important in general, and then I think you have an answer to the question of why the collection of all this data is becoming so much of a problem, because every new device you use, every new app you use, is collecting information about you, and in the wrong hands, the wrong kind of data can undermine your human rights, or maybe even some of your own desires.

 

00:07:04:08 - 00:07:22:10

Dr Adam Andreotta

You may just not want certain bits of information to get out there. Privacy allows you to control the way that people perceive you; there are certain things you only want your family members to know, you know, reputation management. And so when all that information gets out there, we lose a bit of control over that process.

 

00:07:22:10 - 00:07:24:13

Dr Adam Andreotta

But it's just going to be different for each and every one of us.

 

00:07:24:18 - 00:07:44:00

David Karsten

I suppose any individual you speak to, Adam, would argue that our data is ours, but there comes a point, when we talk about informed consent or uninformed consent, where we almost give up the right to exclusive ownership of that data. Is that the case?

 

00:07:44:00 - 00:07:59:14

Dr Adam Andreotta

And this is really a central point of the book. It's not so much that there's anything wrong with giving away that data, but it's about being in control of the data that's given away. So consent is supposed to be the mechanism by which we give it away: we agree to it, or we don't agree to it.

 

00:07:59:14 - 00:08:15:19

Dr Adam Andreotta

And the 'informed' part is that we sort of know what's happening with it. But that's just so hard to do nowadays, because there are so many different types of information being collected about us, and it's almost impossible to get a hold on that and control what's actually happening in the process, which is, I think, alarming to many people.

 

00:08:15:19 - 00:08:21:20

David Karsten

Well, what are the good things about sharing your own personal data with big data?

 

00:08:21:20 - 00:08:42:16

Dr Adam Andreotta

Yeah, I mean, there are lots. If you think about, say, driverless cars, think about the data that's required: they need a lot of data to be trained upon, and that could be personal data, you know, how we drive on the road. Think about advancements in healthcare: with lots of data from lots of patients, maybe eventually AI can get very good at recognising patterns that a human just wouldn't have the time to find.

 

00:08:42:18 - 00:09:01:05

Dr Adam Andreotta

And it's really beneficial. Even entertainment: Netflix, for instance. You save lots of time by getting all your favourite shows presented to you, and that requires personal data. It requires the shows you like: did you spend the whole time watching it? Did you stop watching after two minutes? What other shows have you liked? Did you give it a thumbs up at the end?

 

00:09:01:06 - 00:09:17:03

Dr Adam Andreotta

All those kinds of things. And from that personal data, it can give you a recommendation about what you might enjoy next. Without personal data, we couldn't do any of that, and I think most people like those kinds of features. So there are lots of benefits from having all of that data; we couldn't do it without the data.
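
To make that concrete, here is a minimal sketch in Python of how viewing signals like the ones mentioned (watch completion, thumbs-ups, the genres of liked shows) could drive a simple recommender. The field names, weights and catalogue are illustrative assumptions, not Netflix's actual system.

    # Illustrative toy recommender built on the kinds of personal signals
    # discussed above. Field names and weights are assumptions for the sketch.
    from collections import Counter

    viewing_history = [
        {"title": "Show A", "genres": {"sci-fi", "drama"}, "completed": True,  "thumbs_up": True},
        {"title": "Show B", "genres": {"comedy"},          "completed": False, "thumbs_up": False},
    ]
    catalogue = [
        {"title": "Show C", "genres": {"sci-fi", "thriller"}},
        {"title": "Show D", "genres": {"comedy", "romance"}},
    ]

    # Build a taste profile from shows the viewer finished or liked,
    # counting thumbs-ups more heavily than mere completion.
    taste = Counter()
    for show in viewing_history:
        weight = (1 if show["completed"] else 0) + (2 if show["thumbs_up"] else 0)
        for genre in show["genres"]:
            taste[genre] += weight

    # Rank unseen titles by how well their genres match the taste profile.
    seen = {show["title"] for show in viewing_history}
    recommendations = sorted(
        (show for show in catalogue if show["title"] not in seen),
        key=lambda show: sum(taste[g] for g in show["genres"]),
        reverse=True,
    )
    print([show["title"] for show in recommendations])  # e.g. ['Show C', 'Show D']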

 

00:09:17:07 - 00:09:32:01

David Karsten

And yet, alongside the benefits of the very same slab of data that you choose to share, there are also flip sides. Let's take DNA ancestry testing, for example. You could unwittingly help incriminate a family member.

 

00:09:32:03 - 00:09:49:17

Dr Adam Andreotta

Yeah. So this is where it gets really tricky, right? Because, interestingly, with those cases, 23andMe and the like, there have been some high-profile cases where police have actually worked out who murdered somebody, not by getting the DNA of the individual, but of a related family member. And that seems like a good result.

 

00:09:49:17 - 00:10:16:01

Dr Adam Andreotta

But in those kinds of cases, there are lots of people who didn't consent to giving their information away. You might give yours away, and that might actually reveal information about someone very close to you. So suddenly it's almost like you're consenting on behalf of other family members. And we might say that's okay in cases of, say, crimes being solved, but it gets into a really grey area, because now we can find out information about lots of other people, and they haven't even been asked about it.

 

00:10:16:01 - 00:10:29:22

Dr Adam Andreotta

They haven't given that consent. And that seems pretty problematic, even if it can maybe in the end lead to crimes being solved. It's maybe an invasion of other people's privacy, and that gets into a really difficult kind of grey area.

 

00:10:29:24 - 00:10:55:05

David Karsten

So grey, so grey, Adam. But we talk about informed consent as something where the onus is on us to give that consent. Yet the mechanisms by which we do this, reading privacy policies, setting cookie preferences, that responsibility on us as individuals is actually quite a commitment. Some of these documents are massive.

 

00:10:55:05 - 00:10:59:20

David Karsten

Can you walk us through the current state of privacy self-management?

 

00:10:59:22 - 00:11:15:19

Dr Adam Andreotta

Well, it's really difficult. The other day, I saw a privacy policy that was about 100,000 words or something, so you can imagine how long it would take to get through that. The reality is that most of us don't read all the privacy policies that we're presented with, and obviously for some pretty good reasons, right? There are just more and more of them coming about.
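
As a rough back-of-the-envelope figure, assuming an average reading speed of around 230 words per minute, a 100,000-word policy is close to a full working day of reading:

    # Rough estimate only; 230 words per minute is an assumed average adult reading speed.
    words = 100_000
    words_per_minute = 230
    minutes = words / words_per_minute
    print(f"{minutes:.0f} minutes, or about {minutes / 60:.1f} hours")  # ~435 minutes, about 7.2 hours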

 

00:11:15:21 - 00:11:34:02

Dr Adam Andreotta

There's one for every new app you download; you've got a Fitbit; even cars now have terms and conditions associated with the data they collect. You know, your Tesla might be collecting data about where you've gone to train that autonomous vehicle. There are more and more devices, more and more privacy policies, and it's simply impossible to go through them all.

 

00:11:34:02 - 00:11:56:00

Dr Adam Andreotta

So the reality is that most of the time, in order to use these applications or services, we just click accept. What we're doing is giving an uninformed consent, and the problem with that is we don't really know what we're giving consent to a lot of the time. Sometimes that's fine, but other times it might be that we're giving consent for our information to be used in ways that don't align with our preferences, our values.

 

00:11:56:02 - 00:11:58:00

Dr Adam Andreotta

And that's the concern, right?

 

00:11:58:02 - 00:12:05:11

David Karsten

Well, yeah. I mean, when you put it in those terms, the reality is most of our consent is uninformed if we're being real.

 

00:12:05:13 - 00:12:20:00

Dr Adam Andreotta

Yeah, I think that's right, and that's really the problem. It's just not possible to do it. And given that we do use these applications, we may have to press consent. You know, sometimes you actually give consent just by being on the website or using the service, so often we don't even know that we've given consent.

 

00:12:20:04 - 00:12:43:12

David Karsten

You know, I'm not trying to sound alarmist here, but can you walk us through some of the potential outcomes? In the brief, it's captured as the transparency problem, which we've kind of touched on here, but there's also repurposed data. That's a real issue, and it might not manifest for months.

 

00:12:43:12 - 00:12:59:19

Dr Adam Andreotta

Yeah, exactly. With repurposed data, what I do in the book is look at the comparison with medical consent. The reason that transparency is so important in the medical field is because you want to know what's going to happen in the actual procedure, right? What are the risks?

 

00:12:59:21 - 00:13:25:17

Dr Adam Andreotta

What's the likelihood of success? What's going to happen in the long term, that kind of thing? But once the procedure is done, that's sort of it: you've had your heart procedure, or you've had something done to your leg, and then it's done. But with data it's really difficult, because even after you've given consent for it to be used today, that same data could be used in a different way by another company next week or in five years, or combined with other data in a few more years' time.

 

00:13:25:19 - 00:13:43:22

Dr Adam Andreotta

And it doesn't really become a risk until it is combined with that other data. Or take the case of Fitbit, which Google acquired at some point. The privacy policy might have been fine for you when you gave your data, but then another company comes along, buys the company, and maybe they now have a different privacy policy,

 

00:13:43:24 - 00:14:02:12

Dr Adam Andreotta

one which maybe doesn't cohere with your values. So this data can just sit there for a long period of time, and maybe a year or two or three later, when the policy changes or the conditions change, suddenly a new risk enters the scene. So it's almost like you have to give continued consent.

 

00:14:02:14 - 00:14:21:03

Dr Adam Andreotta

And that's a very strange way of giving consent. Usually we give consent for something to occur, but online we have to sort of give consent for who knows how long the data is going to be used. Maybe other companies eventually want to buy it; new companies come onto the scene all the time and buy existing data for purposes that we can't even imagine.

 

00:14:21:07 - 00:14:25:06

Dr Adam Andreotta

And so it's really difficult to think about the lifespan of the data.

 

00:14:25:08 - 00:14:47:01

David Karsten

Well, when you think that data is actually the commodity to be chasing now, that that is where the real wealth lies, you almost have to look at these privacy policies with a slightly cynical view: that your data is there to be, I guess exploited is a really strong word, but...

 

00:14:47:01 - 00:15:04:19

Dr Adam Andreotta

Yes, and changed as well. In the book, I give an example of X, formerly Twitter, which one day decided to add biometric data to its privacy policy. Now, exactly how that's going to work is not clear. But, you know, if you looked at the privacy policy three years ago, that's probably not still going to be valid today.

 

00:15:04:19 - 00:15:21:22

Dr Adam Andreotta

So the privacy policies can change, the types of data that are collected can change, the uses can change, and the companies that are interacting with this data can change. It's very overwhelming for people. Even if you could spend a few hours going through a policy, how long is that going to be valid for?

 

00:15:21:24 - 00:15:29:10

Dr Adam Andreotta

When might it change next? Are they being transparent? In some cases, the answer is no.

 

00:15:29:12 - 00:15:36:01

David Karsten

I've looked at some of these policies, Adam, and it's one thing to read them and have the time to read them, but how many of us would actually understand them?

 

00:15:36:03 - 00:16:02:10

Dr Adam Andreotta

Yeah. And the problem is exacerbated by the fact that there's a lot of opacity in them as well; companies are often not motivated to make them clear. In the book, I give a few examples of some work that's been done on fertility apps, a recent study by an academic from NSW, and they found that some of the world's leading fertility apps were very misleading in the way the language was couched around the terms, and there were conflicting messages in them.

 

00:16:02:16 - 00:16:20:01

Dr Adam Andreotta

So if you read these things and are confused by them, or you find there are different messages in different parts, it's really hard to know what's actually going to happen or what's to be expected, and it just leaves you in a very vulnerable position. You don't feel very empowered, I suppose.

 

00:16:20:07 - 00:16:23:13

David Karsten

Especially with such a major set of decisions that you're about to make.

 

00:16:23:13 - 00:16:41:02

Dr Adam Andreotta

Yeah. And it's one thing to say, you know, my Netflix logs are going to get collected, but there's everything else that we do online. We spend almost our entire lives online now, so there's so much information being collected, and often it's very, very personal in nature.

 

00:16:41:04 - 00:17:06:03

David Karsten

Adam, it just seems like an insurmountable obstacle, this issue, this conundrum of being, I guess, a participant in the digital world whilst retaining some sort of modicum of privacy. Is the genuine option to actually go completely off grid? I mean, is that even realistic, or is there another way to approach this?

 

00:17:06:07 - 00:17:23:00

Dr Adam Andreotta

I guess it is one option. One thing to say about that is just how hard it actually is. I mean, even coming to Curtin University, you have to use a parking app a lot of the time to park here, and even the parking app will collect lots of bits of personal information: your number plate, maybe the number of times you've parked, the locations where you've parked.

 

00:17:23:02 - 00:17:41:11

Dr Adam Andreotta

There's a question about where that information goes and how well it's stored, because, of course, if it's not well stored, or there are risks of it being hacked or something, then, you know, you wouldn't want to store your money in a bank that has known privacy flaws, right, or known vulnerabilities. So we place a lot of trust in these institutions.

 

00:17:41:11 - 00:17:58:22

Dr Adam Andreotta

And how well do we know even the parking app at Curtin? You can take things offline, but often there's this fear of missing out. If you're a business owner and you're not on Facebook or something, you maybe have to pay a price for that. A lot of people talk online now through messaging systems; if you don't want to be on those, you can do it.

 

00:17:58:22 - 00:18:07:11

Dr Adam Andreotta

But there's often a price to pay, and I think most people are very reluctant, or hesitant, to do that, just because we do spend so much of our lives online, right?

 

00:18:07:17 - 00:18:14:01

David Karsten

Well, what do you suggest, then, Adam? Like I said, how do we tread carefully when it's so difficult?

 

00:18:14:04 - 00:18:31:00

Dr Adam Andreotta

So some authors actually suggest that the problem is so bad that we have to give up on this idea of us taking agency, and what we really should be doing is pushing for legislation. But the approach I take in the book is, no, I think there is a place for it. It's just not something that we can place entirely on ourselves.

 

00:18:31:00 - 00:18:44:11

Dr Adam Andreotta

But I think that if companies were encouraged to try to help us through this process, and maybe we can talk about some of the different approaches that I suggest, we might actually have a chance of managing some of our privacy here.

 

00:18:44:13 - 00:19:05:12

David Karsten

Well, look, firstly, may I ask, Adam, is it ever an option, when you're going through a privacy policy and you need to accept the conditions, to go back to that provider and say, look, I largely agree, but points 2.35 and 7.52, I've got an issue with those. I will agree, subject to those two being left out.

 

00:19:05:12 - 00:19:24:22

Dr Adam Andreotta

I mean, I can't really think of any; I'm sure there probably are some, but that's very, very rare. A lot of the time it's a take-it-or-leave-it kind of thing. What I do in the book is look for inspiration in the medical field. One of the things that medical practitioners will say is, when we're doing procedures, we're going to be transparent, but we also have to offer patients alternatives.

 

00:19:24:24 - 00:19:40:19

Dr Adam Andreotta

You have to say, well, you can do this, and this procedure has these risks, or if you don't want invasive surgery, there's this one over here. And then the doctor is supposed to have a conversation with you and figure out what's best for you. But with a lot of these online systems, you can either accept it or not.

 

00:19:40:19 - 00:20:00:16

Dr Adam Andreotta

And in a way, some of them are quite coercive, right? Because if you don't use them, then you miss out on all the benefits, and users are really fairly powerless in terms of negotiating. You might find the odd one where you can; some of them provide you with lists of options you can select. But for the most part, individuals don't have much power with respect to negotiating.

 

00:20:00:16 - 00:20:17:24

Dr Adam Andreotta

I mean, what's a large company like Spotify going to say if you tell them you're not comfortable with section 4.5? They're going to say, don't use the platform then; there are millions of other people who will. And because they don't suffer any financial penalty from that, short of lots of people not signing up,

 

00:20:18:01 - 00:20:19:24

Dr Adam Andreotta

they probably don't take much of an interest in it.

 

00:20:20:01 - 00:20:28:05

David Karsten

Okay. Well, in the absence of that as an option, Adam, and that was a terrible idea from Big Dave here, but what can we do?

 

00:20:28:05 - 00:20:47:19

Dr Adam Andreotta

Well, it has merit, it has merit. Let's say it's part of a larger set of ideas. But I think, you know, obviously the law has to play an important role. I mentioned the GDPR before, in Europe, the General Data Protection Regulation, and it requires companies to actually get informed consent from people, which I think is a positive sign.

 

00:20:47:21 - 00:20:55:13

Dr Adam Andreotta

And if there are any misleading contracts or those kinds of things, penalties can be issued to companies. So I think that's a really good starting point.

 

00:20:55:13 - 00:21:04:18

David Karsten

Before you go on, Adam, how does that actually manifest? I mean, if we were to look at a privacy policy in Europe, how would it differ? What would it look like?

 

00:21:04:20 - 00:21:28:14

Dr Adam Andreotta

A good example actually involves Australia, where Facebook was accused of harvesting profile pictures and, I suppose, selling them to different interested parties. In Europe, they weren't allowed to do that: they had to offer opt-out options, because the GDPR is set up such that if people want to opt out of things, they have the option to do so. In Australia, since we don't have that legislation, Australians weren't offered that.

 

00:21:28:14 - 00:21:40:23

Dr Adam Andreotta

When they were asked about whether Australians would get it, I don't think Facebook, or Meta, had much of a comment. So that's a practical example of how the legislation can actually make a difference. And it doesn't sound like a big deal, it's just an opt-out option.

 

00:21:40:23 - 00:21:44:17

David Karsten

That is actually a big deal, though, isn't it, Adam? It's massive. It gives us choice.

 

00:21:44:19 - 00:22:00:04

Dr Adam Andreotta

It gives us choice, yeah. So you might say it's only a small thing, but I agree it gives us a lot of choice. And at least it leaves consumers with the capacity to send a message to the company, because if a large number of people are opting out, that could actually send quite a powerful message to companies.

 

00:22:00:06 - 00:22:24:03

Dr Adam Andreotta

The fact that Australians are deprived of that choice because of the legislation points to a need to change that legislation. But of course, legislation can't cover everything. We still want to empower people and make them informed about decisions. Imagine if companies had to get consent from people but people still had to read millions of words of privacy policies; even if the companies were adhering to legal norms, that would still be a problem, right?

 

00:22:24:03 - 00:22:26:02

Dr Adam Andreotta

Because you would still have to read all these things.

 

00:22:26:04 - 00:22:37:10

David Karsten

I interrupted you when you were on a bullet train of thought just before. What were some of the other, I guess, measures that we can take, in a practical sense?

 

00:22:37:14 - 00:23:04:01

Dr Adam Andreotta

Well, maybe I can discuss the three central ideas from the book. One of them has to do with ethics review. If you've ever worked at a university, then before going out and doing a study involving human participants, whether you're doing a survey or asking people about how onboarding processes work, you have to go through an ethics review: have you understood the risks, how are you going to collect the data, where is it going to be stored, all these kinds of things.

 

00:23:04:01 - 00:23:27:22

Dr Adam Andreotta

A couple of years ago, a few colleagues and myself looked at this idea and thought, what if you could do something similar in the big data kind of space? So you would have a policy, let's say from Facebook or Spotify or something, and an ethicist or a team of ethicists, just like the team of people who look at the university ones, would actually have a look at it and say, this is very misleading.

 

00:23:27:24 - 00:23:44:02

Dr Adam Andreotta

What you've put over here confuses the average person, so make it clearer. This is a bit legalese, change it over there. At least that would get rid of some of the deception that is, I guess, rife in a lot of this kind of work, so it would at least make things achievable to a certain extent.

 

00:23:44:02 - 00:24:05:12

Dr Adam Andreotta

That would be one thing: you would hopefully get rid of deception in those kinds of cases. One example I look at in the book involves Meta and a VPN service they had. This particular VPN was advertised to consumers as protecting their data, and what it was actually doing, yes, it was protecting their data,

 

00:24:05:14 - 00:24:32:04

Dr Adam Andreotta

but only from criminals; it wasn't stopping the data from going to Facebook. And a lot of people signed up for this VPN on the assumption that they wouldn't be giving all their information to Facebook about what sites they used and everything else like that. So that was a little bit, maybe not sneaky, but it was the kind of thing where, if it had gone through an ethics approval, an ethicist or a team might have said, look, this is what people would assume who are using a VPN.

 

00:24:32:04 - 00:24:32:16

David Karsten

Exactly.

 

00:24:32:17 - 00:24:52:03

Dr Adam Andreotta

And by including that claim, it's maybe not lying, but it's a little bit misleading, so make sure to spell that out. And if you look through these privacy policies, there are lots and lots of examples like that. Okay, you'll sometimes find flat-out lying, but often the opt-out button is hidden, or it's very difficult to navigate, or things are very confusing.

 

00:24:52:03 - 00:25:05:02

Dr Adam Andreotta

And I think that's done deliberately, so that people just keep going. So if you had some sort of oversight, a review of that, well, how exactly that would work is a question, but you would at least get rid of some of those more nefarious kinds of elements.

 

00:25:05:04 - 00:25:08:13

David Karsten

So pillar one, get the ethicists involved. Pillar two is?

 

00:25:08:14 - 00:25:34:06

Dr Adam Andreotta

Pillar two is to shorten some of these privacy policies. That's a big problem at the moment: we cannot read them. So some of the work that I've done has looked at getting rid of some of the text and replacing it with pictures. And this has been really cool because, in a couple of examples actually involving Bankwest, they were able to change some of their product documents and replace text with pictures of people using their products.

 

00:25:34:08 - 00:25:45:09

Dr Adam Andreotta

And they found that people were actually able to understand what the products were much more easily, with a little comic there as well. It's also been used in onboarding programs when people come into a new business, and in lots of other really interesting cases, too.

 

00:25:45:12 - 00:25:47:01

David Karsten

It's far more demonstrative, really.

 

00:25:47:03 - 00:26:04:08

Dr Adam Andreotta

Far more demonstrative, yeah. In fact, it's actually used in some countries where literacy levels are quite low, and people can still sign employment contracts by looking at pictures of what's actually happening. Literacy levels might be high in the Western world, but when it comes to this stuff, we're often not able to understand any of it.

 

00:26:04:09 - 00:26:10:17

David Karsten

Well, unless you hold a Bachelor of Laws, it's unlikely that you can actually interpret what's been written.

 

00:26:10:19 - 00:26:29:12

Dr Adam Andreotta

Yeah. So why not replace some of these more difficult things with pictures and get the word count down? This is new and emerging research, so I guess you've got to test it, but it seems like people are much more engaged. It's hard to get engaged with millions of words in privacy policies, and it looks like pictures also help you understand things as well.

 

00:26:29:14 - 00:26:33:01

Dr Adam Andreotta

This has been used in learning and in lots of different fields.

 

00:26:33:01 - 00:26:35:10

David Karsten

What, since time immemorial?

 

00:26:35:10 - 00:26:35:18

Dr Adam Andreotta

But I.

 

00:26:35:18 - 00:26:37:18

David Karsten

Really, really like the cave drawings.

 

00:26:37:19 - 00:26:59:21

Dr Adam Andreotta

Cave drawings, yeah. If you want to learn science and maths, visual illustrations can be really good. So why not convey some of the really hard concepts with pictures? That would be one way to get the word count down, make sure engagement is up, and also, at least, make these things actually understandable. So again, some of this has to be tested.

 

00:26:59:23 - 00:27:09:04

Dr Adam Andreotta

But this would go some way to solving some of those kinds of problems. In the book, I give a couple of examples of these that I came up with myself, so I think that's promising.

 

00:27:09:10 - 00:27:19:15

David Karsten

And that makes so much sense. Okay, so we've got the ethicists involved, we're shortening these privacy policies and using visual aids, I suppose, as an element of that. What's the third pillar there, Adam?

 

00:27:19:17 - 00:27:44:14

Dr Adam Andreotta

The third pillar is a really interesting new technology that's kind of on the horizon, but hopefully here very soon, and that's this idea of automated consent. The idea is that we would be able to trust agents, AI agents in their most advanced form, to provide consent on our behalf. So rather than having to go through all of these millions of words, we would define our preferences to these AI agents.

 

00:27:44:19 - 00:28:11:21

Dr Adam Andreotta

They would be our agents, and they would oversee all this stuff on our behalf. So if we're not comfortable with sharing our IP address, or being tracked across websites when we're browsing, or having our Netflix data sold to somebody else, we tell our AI agent that. And insofar as we're allowed to make that decision, which hopefully the ethicists ensure these privacy policies allow, our decisions would be enacted, and this would save us having to go through thousands and thousands of policies.

 

00:28:11:21 - 00:28:32:11

Dr Adam Andreotta

We would trust the AI agent with it, and hopefully our preferences would be consistently applied across the thousands of different applications, websites and apps that we use. And we could maybe trust that our data would not be taken in a way that goes against our values or preferences.
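
To give a sense of how such an automated consent agent might work in its simplest, rule-based form, here is a minimal sketch. The policy fields, preference names and decision logic are assumptions for illustration; a real agent would need to parse actual policies, track changes over time and handle far more nuance.

    # Illustrative sketch of a rule-based automated-consent agent.
    # The policy format, preference names and decision rule are assumptions, not a real standard.
    from dataclasses import dataclass

    @dataclass
    class Preferences:
        allow_ip_address: bool = False
        allow_cross_site_tracking: bool = False
        allow_resale_to_third_parties: bool = False

    def decide(prefs: Preferences, policy: dict) -> tuple[bool, list[str]]:
        """Return (consent, objections) for a policy's declared data uses."""
        objections = []
        if policy.get("collects_ip_address") and not prefs.allow_ip_address:
            objections.append("collects IP address")
        if policy.get("tracks_across_sites") and not prefs.allow_cross_site_tracking:
            objections.append("tracks browsing across websites")
        if policy.get("sells_data_to_third_parties") and not prefs.allow_resale_to_third_parties:
            objections.append("sells data to third parties")
        return (len(objections) == 0, objections)

    # Example: a hypothetical streaming service's declared data uses.
    policy = {
        "collects_ip_address": True,
        "tracks_across_sites": False,
        "sells_data_to_third_parties": True,
    }
    consent, objections = decide(Preferences(), policy)
    print(consent, objections)  # False ['collects IP address', 'sells data to third parties']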

 

00:28:32:13 - 00:29:00:02

David Karsten

Well, this sort of dovetails very neatly into a landscape that's rapidly changing right now. We've just talked about how AI can work for us, in terms of helping us out with our preferences and our privacy policy documents. But AI can equally work against us, in terms of harvesting the data that we've already put out there.

 

00:29:00:06 - 00:29:08:18

David Karsten

I mean, how can we, I guess, fortify ourselves in terms of our privacy protection with all of this coming on?

 

00:29:08:20 - 00:29:28:11

Dr Adam Andreotta

That is difficult because in order to power these AI agents, they will need personal data. And that was sort of back to our original kind of problem. So I suppose, we don't want to trade one problem for another, but we would have to sort of figure out how to safeguard this and make sure that the data is only collected in a way that aligns with our preferences, make sure that no one else has access to it.

 

00:29:28:13 - 00:29:45:01

Dr Adam Andreotta

If I knew the privacy preferences you'd given an AI agent, that might be very valuable to me as an advertiser, because now I can work out your vulnerabilities: what do you allow, and how can I try to sneak in on that? So safeguarding against that would be a big one as well. We don't want to trade one problem for another.

 

00:29:45:03 - 00:30:12:19

Dr Adam Andreotta

We also need to be careful about error, right? Because of course AI, as you alluded to before, sometimes crashes. What happens if our AI privacy agents effectively crash too, in the sense that they give permission for data to be shared in ways that don't align with us, because they may be missing something, or maybe we even made a mistake when we were specifying our preferences? That would be a worry. As soon as we give away control to something else, we put our trust in the technology.

 

00:30:12:19 - 00:30:16:00

Dr Adam Andreotta

What if the technology kind of goes wrong?

 

00:30:16:02 - 00:30:28:09

David Karsten

Would you say, even with the measures that you're suggesting, that we're still maybe a step or two behind the technology? I mean, it is moving so quickly, isn't it?

 

00:30:28:11 - 00:30:51:07

Dr Adam Andreotta

I think so, because, of course, every time we solve one problem, a new one comes about. And what's really difficult, you mentioned the repurposed data problem before: somebody might be perfectly happy with sharing their ping data, you know, your mobile phone pinging the tower, and then next week you find out that data has been sold to a company to work out how to track someone.

 

00:30:51:09 - 00:31:09:22

Dr Adam Andreotta

And so suddenly that's a risk that wasn't there yesterday. New risks come about sometimes overnight. New technology, which we can maybe talk about in just a moment, comes about where some of these methods may just not be effective. So it's a constantly changing space, and some of these solutions might be really good for some of the current problems.

 

00:31:09:24 - 00:31:23:16

Dr Adam Andreotta

But then tomorrow, maybe there's a new set of problems, and advertisers and companies are going to be trying to exploit some of those, right? They're familiar with the laws, familiar with how people protect their data, and they may be able to exploit something else.

 

00:31:23:18 - 00:31:46:21

David Karsten

Well, a really interesting aspect of privacy and ownership of data, and ownership of, I guess, creative licence, is a really big deal at the moment, as AI needs material to learn from. Are you watching on with a great deal of interest as this sort of unfolds,

 

00:31:46:21 - 00:31:57:11

David Karsten

Adam, where artists are stepping up and saying, look, we've not given anyone permission to use my painting or my song or my writing to teach these machines?

 

00:31:57:13 - 00:32:19:09

Dr Adam Andreotta

Exactly. And just as you mentioned before, the data has to be there to make these amazing things. ChatGPT, a large language model, is brilliant, but it's so important to remember that there's training data. Where did they get the training data from? There was actually a lawsuit last year or the year before where a lot of authors were very unhappy, because, no, they didn't give consent for a lot of this.

 

00:32:19:11 - 00:32:42:07

Dr Adam Andreotta

Yeah. Because what makes an author's book really good is that the English is typically pretty good, because this is a professional writer. So this is just the sort of thing you want to train your AI model on. You don't want some blog off Reddit or something where there are probably going to be mistakes; you want, you know, Nobel Prize winners or Man Booker Prize winners, really good works in the English language, or any other language, to train on.

 

00:32:42:09 - 00:33:00:22

Dr Adam Andreotta

And this is the stuff you want. But of course, how do you get permission to do all that? What if it's the work of James Joyce or David Hume, authors who have passed away? How do you even start to get permission for that? What OpenAI, it looks like, have done is just use them and deal with the repercussions later on.

 

00:33:00:24 - 00:33:05:07

Dr Adam Andreotta

Which, you know, from an ethical perspective, is pretty problematic.

 

00:33:05:08 - 00:33:13:03

David Karsten

Oh, it's fraught. Because once the genie is out of the bottle, as they say, there is no going back. And where does compensation come from?

 

00:33:13:09 - 00:33:34:05

Dr Adam Andreotta

But it's even more difficult than that, because a lot of the time the tricky part is knowing whether those texts were actually used. Yes, there are lots of famous works, but there are also lots of amateurish works that describe those works. So a blog that I wrote, well, I didn't, but say, imagine one on James Joyce or some Dickens or something, could actually be used to train the AI.

 

00:33:34:05 - 00:33:50:01

Dr Adam Andreotta

So the question is, did the original Dickens work or some other work get used to train it? And that's really difficult to know, especially when the companies aren't being transparent about the whole thing. You know, even blogs and social media posts, maybe you put all those together and you can sort of piece together what the book was about.

 

00:33:50:01 - 00:34:05:07

Dr Adam Andreotta

Or, you know, you look at reviews and so on. These authors were suggesting that their books were used, but of course the companies can say, oh no, your book wasn't used. And maybe they're wrong, maybe they're right. But in some ways, you don't even need those books, because you have everything else that people have written about them.

 

00:34:05:07 - 00:34:07:17

David Karsten

Yeah, but they have those secondary sources.

 

00:34:07:18 - 00:34:24:04

Dr Adam Andreotta

Yeah, exactly. So it's really hard to know even how to do this. And I guess that's why OpenAI have just done it and thought, let's just deal with it later. But this, I think, is very typical of the way most companies will act: they move fast and break things.

 

00:34:24:06 - 00:34:29:24

Dr Adam Andreotta

And then they deal with the legal fees or the legal problems a little bit after. It's almost like a sense of negligence, right?

 

00:34:30:00 - 00:34:31:02

David Karsten

Big time, you know.

 

00:34:31:02 - 00:34:43:03

Dr Adam Andreotta

So that's what's also really problematic about this: the law can take a long time to come in. Even if the authors win their lawsuit against OpenAI, think about how many years that will take. The damage is kind of already done.

 

00:34:43:04 - 00:35:12:05

David Karsten

I myself have encountered a very interesting scenario within our space here in podcasting and voiceover recording, where voices can be sampled and then subsequent voiceovers can be recorded just using that sample as a basis. But there are companies that offer something called ethical AI, where the voiceover artist has actually recorded, you know, a foundation,

 

00:35:12:07 - 00:35:34:19

David Karsten

a foundation from which to sample. But then, for every voiceover recording made subsequent to that, they receive a portion of that income, of that revenue. So an arrangement was made up front as soon as this technology arrived, and to me, that goes some way toward finding a balance.

 

00:35:34:19 - 00:35:43:23

David Karsten

In a way, in terms of embracing the technology, but also acknowledging the originator of the voice. It can be done if some thought is put into it.

 

00:35:43:23 - 00:36:04:23

Dr Adam Andreotta

Yeah. I think what that example shows really nicely is that the technology is not bad in and of itself; there are lots of really good uses for it. What we really want is the appropriate use, the ethical use: people consulted about it, consent given. Examples like that are, I think, really positive, and it shows that the solution to this is to really think about how the different stakeholders are affected.

 

00:36:04:23 - 00:36:26:23

Dr Adam Andreotta

It's the operating in the background without consent that's the problem. Again, that's why I think consent here is such an important issue: things are done without consent. The example that you gave is a case where someone gave explicit consent, and they were consulted about the matter because it's something very important to them, their work. Which is why, for me, a lot of these issues really come back to consent.

 

00:36:27:00 - 00:36:37:08

David Karsten

Future technologies: you wanted to talk about that, you've got something to say. Tell us about what's in store for us, what we're all going to have to contend with, and what you're foreseeing.

 

00:36:37:10 - 00:36:59:15

Dr Adam Andreotta

Well, I think, you know, with some of this stuff, people could object by saying, look, it's just personal information, who really wants my browsing data or IP address? But some of these new technologies, and maybe I'll just talk about one, emotional AI: this is AI technology that's built upon biometric data. It looks at your face and uses AI to figure out what emotion you're currently experiencing.

 

00:36:59:17 - 00:37:05:20

Dr Adam Andreotta

And what's really challenging about this is that that technology can be used, let's say when you walk into a store.

 

00:37:05:22 - 00:37:06:08

David Karsten

Oh, okay.

 

00:37:06:08 - 00:37:24:01

Dr Adam Andreotta

So it can sort of work out: you're walking in front of the store, you maybe look at a TV and you look engaged, however that might be defined. And if you look engaged, well, maybe they give more attention to you. Or maybe the shop assistant doesn't have to waste as much time with you, because you look disgusted and you're not that sort of customer.

 

00:37:24:03 - 00:37:42:04

Dr Adam Andreotta

And what's tricky about that is you're just walking into a store; where is the consent actually given for that kind of stuff? When you're online, you can have all these fancy solutions that I've suggested, your AI agents and all that kind of stuff, but when you're out in the wild, if you like, video footage can be taken and combined with other data.

 

00:37:42:06 - 00:37:59:16

Dr Adam Andreotta

And as more of this happens, I think we really have to focus on what's actually being collected, because as soon as emotional AI starts to come into things, I think the chance of manipulation gets far greater. And if this isn't ethically looked at, then it can be pretty bad.

 

00:37:59:22 - 00:38:07:08

David Karsten

It's so nebulous. Is this particular area where you'll be throwing your focus and attention over the next few years?

 

00:38:07:10 - 00:38:20:19

Dr Adam Andreotta

I hope so. I think it's a really interesting kind of space. It's funny, with a lot of these companies, you can watch videos on YouTube of how they're doing it. They're very open about what they can do, everything from giving students nudges when they're watching a screen online.

 

00:38:20:19 - 00:38:38:22

Dr Adam Andreotta

It looks at their biometric data, whether they're paying attention, and they say it's really good to have in courses, because if you're sort of falling asleep, it can nudge you with something. So they're advertising all the benefits, which we talked about before. But I suppose there's a role in describing the risks, because a lot of the time it's really hard to see the risk.

 

00:38:38:22 - 00:39:03:00

Dr Adam Andreotta

So I suppose the more we talk about those risks, the more important it becomes that consent is actually given, because when the risks are high, that's fine if you're willing to take them on, but you have to be aware of the risks to give proper informed consent. If you're not aware that an operation has a 98% chance of killing you, then you can't give informed consent, because you don't know that.

 

00:39:03:00 - 00:39:21:06

David Karsten

Dr Adam Andreotta, what a massive topic to try and cover in a few short minutes. We really appreciate you coming in today, and we very much look forward to seeing how your work takes shape over the next few years. We hope there will be an opportunity in the future for you to come back and talk to us about how things have progressed since today's chat.

 

00:39:21:12 - 00:39:21:21

David Karsten

Thank you.

 

00:39:22:01 - 00:39:24:07

Dr Adam Andreotta

Thank you so much for having me. It's been great to talk to you.

 

00:39:24:09 - 00:39:37:18

Sarah Taillier

You've been listening to The Future Of, a podcast powered by Curtin University. As always, if you enjoyed this episode, please share it, and don't forget to subscribe to The Future Of on your favourite podcast app. Bye for now.