The Legal and Ethical Implications of Border Technologies and Human Rights with Petra Molnar - Ep. 55

Download MP3
Jack Sanker:

Welcome litigation nation. I'm your host, Jack Sanker, along with my co-host, Danessa Watkins. Today, we're joined by a very special guest, Ms. Petra Molnar, who's a lawyer and anthropologist specializing in migration and human rights. Petra has been working in this space since the mid-2000s as a community organizer. Her work, as we'll get into, has taken her to conflict areas and militarized spaces in Jordan, Turkey, Greece, the Philippines, Palestine and Israel, Kenya, Colombia, on the US-Mexico border, and all over Europe.

Jack Sanker:

She's currently the associate director of the Refugee Law Lab at York University and a faculty associate at the Berkman Klein Center for Internet and Society at Harvard University. You can find her published in the New York Times, The Guardian, Al Jazeera, the Transnational Institute, and all over the Internet. She joins us today to talk about the policy and law of border security, migration, immigration, and surveillance, which she writes about brilliantly in her new book, The Walls Have Eyes: Surviving Migration in the Age of Artificial Intelligence. Petra, welcome to the show.

Petra Molnar:

Thanks so much for having me. It's a real pleasure to be here with you today.

Jack Sanker:

I have a whole section here that's kind of like your bio bullet points of who you are and everything else, but I think it would probably be best if you just summarized who you are, what your background is, and why it is that you're writing on this topic.

Petra Molnar:

Yeah. Sure. I'd be happy to. So, like you said, I'm a lawyer and an anthropologist by training, but I'm not a technologist. And about 6 years ago, I fell into this area of trying to understand how new technologies are impacting the way that people are moving and crossing borders.

Petra Molnar:

And it's taken me on a bit of a wild ride all around the world, and culminated in the publication of my first book, The Walls Have Eyes: Surviving Migration in the Age of Artificial Intelligence, which tries to tell this global story of how AI and surveillance and drones and all sorts of other experimental projects are playing out across the world's borders.

Danessa Watkins:

Did you get into the technology aspect of it just because of your personal experiences, and seeing, you know, what was out there and what was being used, and maybe things that we in the mainstream don't hear about?

Petra Molnar:

Yeah. Kind of. You know? So I am a practicing lawyer, although I don't take too many cases anymore. But back then, I was doing more of what I guess you could call traditional issues, you know, things like gender-based violence and immigration detention, but not technology.

Petra Molnar:

And it was really as a result of an event that we held at work on technology and power, trying to understand how new projects that are being introduced are exacerbating racism, discrimination, and just terrible experiences that people are having. And then I started asking questions like, what is happening on the immigration side? You know? But, again, I'm not a tech person by training. And back then, my knowledge of these things was really limited.

Petra Molnar:

We're talking, like, Wikipedia-level knowledge of what's an algorithm. And I found it quite intimidating because sometimes I think the technology space can be a bit difficult to penetrate. But I saw time and again that these new kinds of projects and technologies were being introduced without really having a conversation about their human rights impact, what it's doing to real people. And from a legal perspective, that was really troubling for me. And so, yeah, it was a bit of a happy accident, I suppose.

Petra Molnar:

But it has taken me, yeah, on this journey of trying to understand this interplay.

Jack Sanker:

And Danessa was asking about your, quote, unquote, personal experiences, and I think it's well worth highlighting here that, maybe unlike some academic work, this book and what you had to do to put it together involved quite a lot of literal boots on the ground, going to hotly contested, sometimes militarized spaces. I don't know what you wanna call yourself in this scenario, but I would say investigative reporting to some extent, and actually getting your hands dirty, quite literally. So if you wouldn't mind just giving the audience a taste of where you were physically, and the spaces you were in, as you were putting together what would eventually become this book.

Petra Molnar:

This is, I think, where the kind of anthropology hat comes in, this commitment to trying to work from a grounded perspective, so being present in spaces where these technologies are playing out. And, again, it kind of happened organically, but as I was expanding my own understanding of how new technologies are being used at the border, I went from Canada to different parts of Europe. I ended up living in Greece for almost 3 years because it was one of the key sites of technological experimentation at the border. But then also, because I've always worked from a comparative perspective, I find comparison to be a very, very useful tool of analysis.

Petra Molnar:

And so being able to say, okay, well, this is happening along the fringes of Europe, but what is happening at the US-Mexico border? What's happening in East Africa? What's happening in occupied Palestinian territories? All of these spaces speak to one another.

Petra Molnar:

But you're right. I mean, it is ultimately coming from a commitment to being present in these spaces, because there's only so much you can do from the ivory tower or an office somewhere.

Jack Sanker:

Right.

Petra Molnar:

And I think it's perhaps also motivated by the fact that I, you know, have my own migration story, which has motivated my work in this area more generally. But also the commitment to try and tell complex stories from a human perspective, because I really didn't wanna write an academic book that would be sitting on a shelf somewhere, impenetrable and really heavily written. This is meant more as a story, to be able to illustrate some of the impacts of this technology on real people.

Danessa Watkins:

Yeah. I mean, I flew through your book. I think that's what's great about The Walls Have Eyes: you kind of start out with a personal story of something you experienced, people you met along the way, their struggles, you know, sharing their perspective, and then describe the technologies that are being used, really, I don't know, unfairly, illegally, I would say, even though maybe the laws aren't in place yet. It puts a real human face on what, like you said, could otherwise just be kind of a stale talk about these technologies. But once you start meeting these players and the people that you came into contact with, and seeing how it affects their lives, yeah, it's a page turner, I will say that. But it also made me feel like, wow, my head's been buried in the sand to not realize how these things are being used.

Danessa Watkins:

So I can't say enough good things about the book. I've already shared it with all of my family. Actually, my mother-in-law is in Arizona, and she had no idea about the robot dogs that are being deployed there. Can you discuss a little bit about that?

Petra Molnar:

Yeah. For sure. I think that's probably one of the more visceral examples. So when I first went to Arizona, again, one of these spaces where technological experimentation intersects with the border, I also had no idea that it was going to be kind of one of these flashpoints for these really high-risk technologies, which are essentially something that you might see on, you know, a sci-fi episode of, like, Black Mirror or some other show. Literally. Yeah.

Petra Molnar:

Literally. Right? And so when I was in Arizona, I started working with some search and rescue groups that are there to assist people who are crossing the desert. You know, because the Sonoran Desert is such a beautiful place, but it's also a very deadly place, and people lose their lives as they're making these journeys to try and claim asylum. And then you have these search and rescue groups that are often made up of, you know, 70- and 80-year-olds, who instead of relaxing on a sun lounger are going to drop water in the desert or show up for people in these really kind ways.

Petra Molnar:

And I spent some time with one particular group called Battalion Search and Rescue, and we trekked into the Sonoran Desert to visit the memorial site of Mr. Alvarado, a young husband and father who died mere kilometers away from a major highway as a result of these draconian surveillance technologies that are pushing people more and more into the desert. But right around that time, in probably one of the most surreal moments of my career, and now that you've read the book, I think you've seen I've had many such moments, right around the time that we were literally on the sands of the Sonoran Desert, the Department of Homeland Security announced that they would be augmenting their migration management arsenal with robodogs, which are essentially a quadruped, military-grade, autonomous technology that would be joining this arsenal of border enforcement tech and chasing people like Mr. Alvarado through the desert. I mean, this is one of the sharpest manifestations of the technologies that I explore in the book, but by no means the only tech that's used at the border.

Petra Molnar:

You know? I mean, especially the US-Mexico borderlands are a space where so much technological experimentation occurs, whether it's AI towers that are dotted throughout the Sonoran Desert, different surveillance equipment, license plate readers. It creates this surveillance dragnet that people are increasingly caught in.

Jack Sanker:

The term that we're using here, which I find so interesting, but it's worth connecting the dots on, is technological experimentation. It's implied, and I think it's also true, that different policies, technologies, or just ways of doing things are being tried out on the borders, not just the United States-Mexico border, but, as you get into in detail, places all over the world. But the term experiment implies that they're gonna take certain elements of those things, which either work or don't work, and they'll broadly end up being applied toward domestic populations. Right?

Jack Sanker:

This is, I think, Foucault's boomerang. I don't know if you know it. Yeah. It's exactly that, which is, you know, policing tactics or technologies that work against a population abroad will inevitably find their way back to the domestic population. And you give several examples in the book.

Jack Sanker:

The drones that are being deployed on the southern border of the United States, when they're not being used by, like, the Department of Homeland Security or whatever other agencies, are being lent to local law enforcement, for example. Many, many things like that. So I think it's so interesting, because you're looking at this kind of border security thing, which is obviously a very political issue from whatever perspective you wanna come at it from. I think it's extremely politically charged, but it's so interesting because you ought to care. You should care what's happening down there, from whatever perspective you're approaching it from. And anyone who has this sort of, you know, maybe viewpoint that's common on the right, of, like, big government surveillance or tracking and all those things, you don't have to be conspiratorial. It's, like, literally happening, and this is the, you know, the proven account of it. Right?

Petra Molnar:

Yeah. Exactly. Exactly. And, you know, even the robodogs that we were just talking about, right, a year after they were announced at the US-Mexico border, the New York City Police Department held a press conference and proudly announced that they wanted to use robodogs on the streets of New York to, quote, keep New York safe. And one was even painted white with black spots on it like a Dalmatian.

Petra Molnar:

I mean, again, it's this kind of normalization of tech that starts at the border and then bleeds over into other facets of public life. So I think it's kind of a twofold thing here. Right? Like, I think it's really important to pay attention to what happens at the border, because they are spaces that are, you know, very opaque, very discretionary, where it's difficult to understand why certain decisions get made, which creates this kind of perfect proving ground or testing ground for experimentation that maybe at first blush wouldn't be allowed in other spaces. But then what happens is, over time, it becomes normalized and then starts proliferating into other facets.

Petra Molnar:

Whether that is, again, the robodogs on New York streets or the surveillance of protesters or even sports fans. There are now conversations about using facial recognition in sports stadiums, for example, right? So again, it is this normalization of tech that we really need to pay attention to.

Danessa Watkins:

Well, I was gonna say, a couple shows ago on our podcast, we were talking about biometric data and recent laws that were passed regarding that, or that are being used more frequently in the US, to try to protect people from the collection of that type of data, or the use of it, or how it's stored. So I did think it was interesting, I guess, and not surprising, this question about the United Nations High Commissioner for Refugees: okay, so they have this huge database. Now who has access to it? I saw that it's being shared with the US, even if those refugees aren't making it here. You mentioned that it's vulnerable to hacking. I mean, data is king now.

Danessa Watkins:

So, yeah, once we have all of this information, very sensitive information about people: who has access to it, how is it protected, why is it being shared, you know, for what purposes? Because the people whose data is being collected, they don't have any say in that. So yeah. Aside from that, or if you wanna talk about Kenya, I'm sure there are other places where this is happening, and, you know, what are the effects of that?

Petra Molnar:

Yeah. And thanks for bringing that up, Danessa, because that's kind of the underlying logic sometimes, the fact that data is the new oil. Right? And we need more and more data to power these technologies of surveillance and automation and just, you know, this kind of incursion of tech into our daily life. Like, it cannot be done without data.

Petra Molnar:

But then it also is important to ask who is the data subject and who can opt out from being part of these massive databases. Because ultimately, you know, the conversation I think that we're having is about technology, but it's actually about power. And it's also about the opportunity to say no. I don't want to be part of this. And when we're talking about spaces like refugee camps, that is where there's a huge power differential.

Petra Molnar:

And that's even inherent in the way that international organizations like the United Nations are operating, in a pretty problematic way, because they've also normalized biometric data and iris scanning, for example, in refugee camps in Jordan. Instead of people using an identity card, they now have their eyes scanned, and then they get their weekly food rations. Like, at first blush, that sounds good, because, of course, you want to be able to disperse humanitarian support quickly and efficiently. But how do people actually feel about that? Right?

Petra Molnar:

Well, I mean, how would we feel if we went into, I don't know, a grocery store and all of a sudden it was like, well, we have to scan your irises before you can come in? People would be up in arms. But when that actually happens in a refugee camp, those discussions are not being had. And, you know, I've spoken with people who say, well, I feel really uncomfortable about this, but if I don't get my irises scanned, I don't eat that week. That's not really free and informed consent.

Petra Molnar:

Right? It really breaks down along these lines of just huge power differentials. And also, it reduces people to data points rather than actually seeing their full humanity. And one last thing I will say, and I already alluded to it, is the vast human rights impact of what happens when there are data breaches, or when really sensitive information is inappropriately stored or shared. And this has already happened.

Petra Molnar:

The UN collected a bunch of information on Rohingya refugees who were escaping Myanmar, ended up in Bangladesh, and have been living there for many years now. They collected all this information and then inadvertently shared it with Myanmar, the government that the refugees were fleeing from. I mean, that's a huge breach. Right? Huge.

Petra Molnar:

Yeah. And yet, how can this happen? We really need a lot more conversations about why this data is even being collected in the first place, and why it's being shared and stored in these inappropriate ways.

Jack Sanker:

I find so many of the, like, I don't wanna, you know, take a strong political position on this, but so many of the talking points from either side you wanna come at here, and the kind of conspiratorial, maybe, you know, tinfoil-hat stuff that people often will talk about and you kind of brush off, no one's actually, like, pointing the finger at the right thing when they say this. And so, the 2017 extreme vetting initiative that you discuss in the book, which ICE, I believe, is still using, or I don't know if it's been halted or not. But which involves assessment of someone's, you know, personal social media, like, risk profiles, based on, as I understand it, a completely opaque algorithm for which we have no idea, or very little idea, what the inputs are. A lot of that information is classified, which is something you mention in the book.

Jack Sanker:

Travel records, of course. And the goal of this, you know, giant Rube Goldberg machine of social assessment is, you know, to determine whether that person would, quote, be a positively contributing member of society, and predict whether they intend to commit criminal or terrorist acts after entering the country, unquote. And that, to me, I mean, that is so, you know, Minority Report. That is, you know, for people that bang their fists on the table about, like, you know, the Chinese social credit score or whatever, that's kind of exactly what this is here.

Jack Sanker:

And so it's so interesting, because folks that may otherwise not be inclined to be sympathetic toward the things that are happening to migrants at the border, who, you know, haven't asked for it and don't deserve it, ought to care, right, because this is the big government surveillance, these are the things that, you know, they say that they're worried about. It's already happening. It's just happening to a population that no one cares about. Yeah.

Petra Molnar:

And that's exactly it. And it doesn't just stay there. Right? Like, just because things are tested out at the border, then they become normalized, and then, you know, it opens up the ambit much, much more broadly. And I think the examples that you're bringing up, Jack, are really quite disturbing.

Petra Molnar:

Right? Because they are forward looking. It is really like a Minority Report situation where, like, really, are we comfortable with governments making predictions about our behavior in the future, and based on what? Right? Oftentimes, we don't even know what the parameters are.

Petra Molnar:

I mean, the extreme vetting initiative is perhaps the most extreme example of this, but other jurisdictions have done the same. New Zealand, for example, introduced a kind of pilot project where they wanted to root out, quote, unquote, likely troublemakers. What does that even mean? Right? What grounds will that break down upon?

Petra Molnar:

Right? I mean, we can all imagine how incredibly racist and discriminatory this can be. Right? And I think, again, as we are seeing a shift to the right all across the world, and having these really heightened conversations about migration, about security, about the environment, right, and the fact that many people are going to be migrating as a result of environmental degradation, if we normalize these kinds of predictive technologies, chances are that we're gonna be seeing more and more of them.

Danessa Watkins:

Mhmm. Well, I was wondering too. I mean, we saw in the US, particularly, like, after 9/11, I think more Americans were open to the idea of higher security and more surveillance because we felt, you know, we had been attacked. Okay, now we're okay maybe with the government monitoring more.

Danessa Watkins:

Did we see something like that with COVID-19 as well, where suddenly everyone kind of feels vulnerable? And so maybe people are more open to, and I don't mean, obviously, people at the border, I mean in domestic areas, are people more open to these technologies that are intrusive, simply because they feel more vulnerable?

Petra Molnar:

Yeah. Absolutely. You know, and there have been others who've been doing really amazing work kind of tracking this vulnerability and how it's also kind of weaponized against people in crisis. Right? Because, of course, I mean, the COVID-19 pandemic was and is a global health crisis.

Petra Molnar:

Right? And so it makes sense that for people, all of us, experiencing this kind of unprecedented event, maybe, you know, psychologically, there's a predisposition to grasp at straws and say, well, you know, whatever we can get to make the world better, we are going to do that. But actually, there were a lot of laws, not even just surveillance, but, like, a lot of, you know, stretching of what normally people would not be comfortable with. But again, because we were in a massive crisis, it normalized the ability of the state to say, well, we're gonna create incursions on people's, you know, freedom of movement. Right?

Petra Molnar:

Mhmm. People's freedom of expression, data gathering indiscriminately, and all of that. And a lot of those vestiges have remained with us. Right? And I think that's the concern.

Petra Molnar:

Whenever we are operating from a crisis mentality, unfortunately, and, you know, scholars have for decades, centuries, been trying to raise alarms about this, crisis actually breeds the normalization of more and more control, right? And unfortunately, we are living in a time that is informed by crisis thinking. Right? Whether it's the environment or political crises or just the widening, you know, kind of divisions between groups. I think the concern is that, again, technology is a lens through which to understand power, and who is around the table when we innovate, and why, and who gets to decide.

Petra Molnar:

Like, yes, we need more surveillance or we need more robodogs or we need more data versus actually thinking about, well, how can we use technology to improve the lives of people? There's kind of that other side to it too, but, again, it always breaks down along the lines of power and privilege, especially in times of crisis.

Jack Sanker:

Mhmm. Speaking of COVID-19, you write about in your book some of the lockdowns that were in place in European countries, I mean, I think also applicable here. But in Greece, for example, with the kind of lockdown procedures and protocols that were in place, the government there took advantage of those to keep migrants, for example, in camps for much longer than they would have been under normal circumstances. And, again, that to me just echoes so loudly of the conspiratorial, you know, if you remember in the United States, like, the they're gonna use COVID to put us all in camps, you know, kind of crazy talk that you, you know, ignore.

Jack Sanker:

And then here we are. Here's, like, a, you know, quote, unquote, Western country, which kinda did that. And, you know, it kinda happened there. It's the, you know, Orwellian stuff that a certain class of political pundits in this country, you know, make their money talking about all the time. But because it happens to a group of people that, you know, no one seems to worry about too much, it's completely missed. You know, like, I learned this reading your book.

Jack Sanker:

And you would think that folks that were concerned about that type of thing would have learned this when it happened 2 or 3 years ago. And it just flies completely under the radar, due really to the identity of those who are affected by it the most. And it's absolutely fascinating when you think about it through that lens, at least for me. Yeah.

Petra Molnar:

And I think there's a lot of kind of flying under the radar, pun intended, since we're talking about radars.

Danessa Watkins:

Right. Yeah.

Petra Molnar:

That happens at the border. Right? Because, again, they are spaces of opacity and also spaces of crisis. Right? And we are seeing this around the world, where borders are increasingly becoming militarized, and also difficult to access for human rights monitors like me, but also for journalists.

Petra Molnar:

Right? It again kind of plays into this mentality that the border is this kind of free-for-all where anything can happen, this kind of frontier zone of experimentation, which has happened for many, many years. It's just that now we have this tech that also plays into it. And that's perhaps where, you know, I remember, Danessa, you said something early on in our conversation, that there's not a lot of law right now, and that's precisely it. Because we don't have a lot of guardrails around technological experimentation, period, let alone at the border, it exacerbates a lot of the kind of high-risk projects that are playing out there, largely unknown.

Petra Molnar:

Oftentimes, you know, people find out about them as a result of some, like, random investigation, or through sheer chance, or a journalist does a piece and all of a sudden, you know, a new project hits the media. But there isn't this kind of commitment to transparency and public accountability when it comes to tech development generally, I would say, but especially at the border.

Danessa Watkins:

I mean, in normal society, a lawyer is held, you know, to a certain level of prestige, and you get a certain amount of respect from your community. But did you find, in going to these places, that it was the opposite, that you were now seen as the enemy by some of these groups? I would think lawyers would stand right out as, nope, stay away, get out of here. Anyone with a camera, anyone with a law degree, anyone with the education that would spread the information.

Petra Molnar:

Definitely. Definitely. I mean, that was the case in a variety of the different areas that I've worked in, where it was almost a liability to say, oh, I'm a lawyer. But then, on the other hand, you know, I think also when you are a lawyer, you, of course, are aware of your own rights, and your ability to kinda do your job depends on, you know, speaking truth to power, so to speak, and asking the hard questions. And so, definitely, I noticed a trend where, yeah, as more borders become militarized and difficult to access, it's no accident that states don't want people who are human rights monitors or who are journalists kind of poking their noses in there when, you know, a lot of, again, that kind of experimentation is happening there without public scrutiny.

Petra Molnar:

So it's a little bit of a double-edged sword. You know? I think it definitely helps to be a lawyer doing this work, but sometimes it can make you be seen as somebody that they definitely don't want there.

Danessa Watkins:

Yeah. Yeah. I had a friend who did some work in Guantanamo Bay, and he said every time he flew down there, it was like, you know, treated nice, treated nice. He's a gentleman, he's in a suit. And then as soon as he stepped off the plane and got there, it was like he felt like he was one of the inmates.

Danessa Watkins:

Like, the guards, you know, didn't want anything to do with him, would put up, you know, roadblocks every step of the way, make him wait hours to talk to his clients, those sorts of things. And I just would imagine you would encounter those same sorts of, yeah, I guess, arbitrary blockades, you know, when you get to these places.

Petra Molnar:

Yeah. Definitely. And the other issue that I have had, and it's not just unique to me, but I think to a lot of people in this space, is trying to also engage with the private sector that's involved in this. And I know that's not something we've talked about yet, but they're a major, major player in the development and deployment of this technology. Because oftentimes, states just can't do it themselves in-house.

Petra Molnar:

Right? Like, they need to contract out these projects to private companies. And when you go to, you know, some of the conferences where this technology is being sold and kind of proposed as a solution to these complicated, quote, unquote, societal problems. Right? Like, oh, you're worried about refugees?

Petra Molnar:

Here's a robodog. If you buy this, this will solve your problem. That's the kind of thinking. As soon as they see that you're a lawyer or an academic, they don't wanna talk to you. Right?

Petra Molnar:

Because they know that you're there trying to get at the parameters of what they're doing and why and all of this. And so that was definitely something that I was, you know, I wouldn't say surprised by, but it was yet another element. Because when you're trying to understand this ecosystem, this multibillion-dollar border industrial complex that has grown up around the development of migration management tech, trying to understand how the private sector fits into it and, like, following the money is very difficult, because a lot of it isn't public either, and they don't wanna talk about it.

Jack Sanker:

I'm so glad you introduced me to the term, or the phrase, border industrial complex, because I could just mentally, like, do a control-F, find and replace on military industrial complex, and take all of my skepticism and criticisms of, you know, foreign engagements or whatever it is, cut and paste, and now I have a useful framework for thinking about it when I hear something in the news or when I'm just thinking about, like, the border. That's such a useful phrase for me to, like, conceive of, you know, everything. Because I think today, in particular, and this is kind of broadly across the political spectrum, there's skepticism of, you know, foreign engagements. I mean, all of a sudden, there's a whole cohort of folks that are, you know, isolationist or whatever you wanna call it. And so the term military industrial complex is so ubiquitous, and it's thrown around, and people understand inherently now what that means.

Jack Sanker:

And so to just go, oh, the border industrial complex. For me, it's just like, oh, that just clarifies so much. I don't have to do any more thinking from, like, the ground up. Like, the same principles all apply. And, like, I remember reading the book.

Jack Sanker:

I was like, oh, I understand so much about this now just from flipping that switch in my own brain. It's very helpful.

Petra Molnar:

Oh, thanks. And, you know, I can't take credit for that. I mean, I would also urge listeners to check out the work of Todd Miller, who's an amazing journalist in Arizona. He's been working on the border industrial complex stuff for years, and I think it's a really helpful framing. Yeah.

Petra Molnar:

Because, again, it centers the kind of capitalism that is inherent in the way that we develop technology, and the fact that there's big money being made, just like in military engagements, but also at the border.

Jack Sanker:

Well, and also in that the investments in it aren't necessarily, like, optimized for, you know, solving the problem so much as they are optimized for prolonging the spending of money.

Petra Molnar:

Yeah. Exactly. And I think that's something that we don't really talk about. Like, I actually think we've kind of lost the plot on a lot of the conversations around migration, and I understand that it is a very complex, very fraught topic that is very divisive. But, you know, for me as someone who's been working in migration on and off in different ways since, you know, 2008, all the people that I've met, and it's been hundreds, right?

Petra Molnar:

Nobody wants to be a refugee, right? Like, the thing is, people are forced to move, and they're also exercising their internationally protected right to do so. The right that belongs to me, to you, to listeners, to everyone on the planet. Right? But they are moving because of war, because of destabilization, oftentimes because of Western action, right, in places around the world, the environment, all sorts of reasons.

Petra Molnar:

So I think we actually need to flip the conversation and say, well, how do we address the root causes of forced migration? Like, how can we support communities locally and spend even a fraction of this money that is being used in the border industrial complex? Like, I think it's projected to be around $70 billion in the next couple of years. I mean, even a fraction of that would go so far to, again, deal with the root causes of displacement. That's where we need to start.

Petra Molnar:

Because, again, you know, I think we've painted people on the move as threats, as criminals, as people who are taking advantage of the system somehow, versus looking at them as human beings. And, you know, migration can happen to all of us. Right?

Jack Sanker:

Mhmm. Yeah. And you brought up the sort of international rights of movement and these things. And, you know, this is kind of a lawyer podcast whose audience is probably largely comprised of a lot of other lawyers, things like that. 2 concepts that I thought may be worth mentioning: non-refoulement, and then the 1951 Refugee Convention.

Jack Sanker:

To me, they seemed to be, like, the underpinnings of this idea of internationally protected rights of movement. And, I'll speak for Danessa here, we're not the type of attorneys that have to deal with this. So could you tell us what those things are, why they're important, and who those rights or obligations apply to?

Petra Molnar:

Yeah. Sure. So I'll take off my anthropology hat and put my lawyer hat on. I'm a reluctant lawyer, but still a lawyer. Yeah.

Petra Molnar:

Yeah. Absolutely. I mean, again, I think this is one of the foundational underpinnings of these conversations that we sometimes forget about, because, you know, again, it's a right that is internationally protected and available to all of us: the right to leave our country of origin if we face persecution, if we face danger, and to seek support somewhere else. And so the 1951 Refugee Convention, which was drafted after World War 2, is the foundational international legal document, and it stipulates that, you know, there's a legal test that you have to meet.

Petra Molnar:

There are 5 categories on which you can claim refugee protection, but you have the ability to leave your country and seek protection elsewhere. The non-refoulement principle is a norm under international law that says you are not allowed to remove someone from your country if they will face persecution in the place that you're removing them to. Right? And this is something that, again, I think is under threat a lot these days when we talk about practices like pushbacks, or the ways that, you know, different states, for example, are not even allowing people to come onto their territory and claim asylum. It's almost like some states wanna have it both ways.

Petra Molnar:

Right? Like, they wanna say, oh, we are human rights respecting. We have signed and ratified all these nice pretty international documents and also imported them into our own domestic legislation. Right? Like, in the United States, Canada, Europe.

Petra Molnar:

But then, actually, when push comes to shove, it's like, well, we're going to derogate from them and actually create this other regime where people are not even allowed to enter the territory where those rights kick in. But again, there is a really foundational, strong framework on which to pin responsibility when we're talking about asylum law. This is also, you know, an established line of jurisprudence that has been with us for decades now. But we are seeing a weakening of these norms, unfortunately, these days, and technology plays a part in that, because it also kind of disaggregates the border, right, from its physical location, not to get too theoretical. But if you can use technology, right, to push your border either vertically into the skies through drones and different types of surveillance, or horizontally through data collection and, again, different types of surveillance, then people are not even making it to your border in the first place.

Petra Molnar:

Right? You're kind of creating a zone of bordering that is much wider and much more difficult to cross, and that's no accident. There is definitely a weakening of this regime that we've seen across all these different jurisdictions around the world.

Jack Sanker:

Well, I think it's because now the promises and obligations that many of these countries, you know, purport to support are actually being tested, maybe, at least in large numbers, for the first time in a long time. And it's kind of a, do you actually support these ideas or not? And when push comes to shove, it's, actually, we don't. We actually don't support them.

Jack Sanker:

And, in fact, you know, one way we can avoid having to abide by these treaties or laws, whether they're international or domestic, and avoid incurring obligations to refugees or migrants, is to, you know, for example, build a huge wall so that you can't come here and claim them. That seems to be what was happening in, I mean, the example that jumped out to me was your time on the border of Greece and Turkey, and I can't remember what region it was, but it was the part of Greece that's, like, close to Istanbul and, like, Bulgaria. Where it's, you know, otherwise a pretty agricultural and, I don't wanna say backwater, but certainly not a node of power in Greece, but all of a sudden it's getting, you know, billions of dollars in funding to build, like, AI sensor towers and all of this security technology, just because it's on the border, to keep refugees from coming into Greece. You know, to keep them from coming in, because the moment they set foot inside, they, you know, have a claim to certain rights and privileges, whether the country wants to uphold them or not.

Jack Sanker:

But at least they have that claim once inside. So it seems like, you know, the investment is coming to a kind of poor, backwater part of the country that would love to have, you know, government investment in any other sector, I'm sure. But what they get is, like, you know, sensor towers and robot dogs, to keep out even poorer and more needy people from coming in.

Petra Molnar:

Yeah. Absolutely. But what's also interesting, right, is, like, so many states use precisely this logic of deterrence, of saying, well, we need more technology. We need a wall. We need robodogs.

Petra Molnar:

We need AI sensors. Let's put them in these spaces that are sometimes, you know, economically disenfranchised. Right? And we're going to prevent people from coming. But the thing is it doesn't work.

Petra Molnar:

Right? People will just take more dangerous routes to avoid detection and to avoid surveillance. This is something that's been documented at the US-Mexico border, where, since the introduction of this so-called smart wall and smart border regime, deaths have nearly tripled, because people still continue coming. Right?

Petra Molnar:

They're just taking more circuitous, more dangerous routes through the desert, which leads to an increased loss of life. But walls don't work, right? When people are desperate, and also, again, when they're exercising their internationally protected right to asylum, they're going to find a way. And if I can just share an anecdote from very recently. So when my book came out in May, I went to Arizona to share it with some of the search and rescue groups there.

Petra Molnar:

And we had a really nice time in Tucson and some of the neighboring communities. But, you know, because of my ethnographer hat, I was like, what's happening at the border? Let's drive down and see some of this new smart wall stuff that's been introduced, and, like, what's changed since my last visit there about a year ago. And so I went down there with journalist Todd Miller, and we were just kind of, you know, standing there shooting the shit, like, talking about what's happening. And we were in Nogales, which is a border town that's bisected by the wall.

Petra Molnar:

So you have Nogales in Mexico and Nogales in Arizona. And we were right by the wall, which you can go to. You can see it. You know? There are sometimes Customs and Border Protection trucks kind of rumbling by.

Petra Molnar:

There's a fixed integrated surveillance tower there. You know? There's now a new smart track, which I guess is gonna have all sorts of sensors in it, whatever. And it was a really, really hot day. And we're kind of standing there sweating, really dusty, you know, and looking over into this area where the tower stands.

Petra Molnar:

And all of a sudden, I notice movement out of the corner of my eye. And we literally saw a young man scale the wall, jump down in front of us into the sand, right underneath the surveillance tower, shake himself off, and run off into the neighborhood. Right? And to me, that is such an example of just, like, how these surveillance techniques don't work. Right?

Petra Molnar:

Because people will find a way. So instead of spending and wasting all this money on surveillance tech, like, let's think about what else we could be doing to improve the asylum system, to give people access to lawyers, to support them psychosocially, and, again, to address the root causes of migration in the first place, rather than investing in surveillance technology that doesn't even work. But then it also makes me think, though, like, is it about the performance of surveillance? Right?

Jack Sanker:

Right.

Petra Molnar:

Rather than it actually working, it's about the state making us feel like they have it under control. They're, like, introducing all this tech, and it's not actually working, but maybe that's not the point. Maybe it's just this theater of the border that we have to pay attention to.

Jack Sanker:

I mean, it's manifestly not working. And if your metric is, you know, keeping, let's say, people out, well, anytime you turn on the news or look at TV, you see people screaming about how many people are coming in. So it's, like, objectively and manifestly not working, if that's the case. No matter how much money is being spent, it's still happening. So to your point, perhaps this technology and everything else is really not a worthwhile investment, even if that is your goal, which I'm not saying it should be.

Jack Sanker:

But if it is, it's not working. They're, like, still coming, as you mentioned.

Danessa Watkins:

Obviously, and not surprisingly, AI has been all over the news for a bunch of different reasons, but it seems like, finally, the European Union is paying attention to the dangers of AI. I think on a global level, we are. People, you know, scientists are coming forward and saying, here's what can happen. We need to pay attention to this and how it's being used.

Danessa Watkins:

But specific to your line of work, you've gone to some of these conferences. I know some of the big ones before the UN. What are you seeing with regards to where we're at in trying to regulate artificial intelligence when it comes to the borders?

Petra Molnar:

Yeah. It's a really interesting moment when it comes to the kind of governance and regulatory conversations that are being had, and, unfortunately, it's been a bit disappointing in the last little while. I wanna point specifically to the European Union's AI Act, which was, you know, something that was in the works, this big omnibus piece of lawmaking to try and regulate AI. And it took many years to come to its final form, right, which happened earlier this year. And, you know, there was a group of us, in full disclosure, some lawyers, some civil society, some academics, trying to get members of the European Parliament to think about the human rights impacts of border technologies and call for some amendments to the act.

Petra Molnar:

For example, saying you need to ban predictive analytics that are used for border interdictions or pushbacks. Right? Like we were talking about with the non-refoulement principle, something that's actually blatantly illegal under international law. Or using individualized risk assessments using AI at the border, things like that. But again, given the fact that migration is a very politically charged issue, a lot of these amendments just didn't pass. And so now, when you actually look at the text of the AI Act, at first blush, it might seem like, well, you know, what's the problem?

Petra Molnar:

There's a risk matrix, you know, that is used to assess different technologies, and border tech largely falls under high-risk technologies. So isn't that a good thing? Well, except, and for the legal listeners in the room it won't be a surprise when I say this, there are so many ways that, again, we can derogate from this kind of risk principle that is there. Because as soon as a technology becomes designated as being for national security use, then the framework doesn't hold anymore. Right?

Petra Molnar:

And so it just allows for technology to be kind of pushed into the border space and into migration under the guise of national security, without the act and its frameworks applying. And this is not only a European issue. Right? The thing is, the AI Act could have been a real push to call for governance and regulation of border technologies globally. Right?

Petra Molnar:

Because it is the first regional attempt to govern AI, and it sets the precedent. And so if the AI Act is weak, there's just no incentive for the United States, for Canada, for Australia to regulate. This happened, you know, with the General Data Protection Regulation, the GDPR, in the EU, right, which set a really strong precedent. Not that it's perfect either, but it forced other jurisdictions to think about data protection, you know, behind closed doors in their own jurisdictions. And, unfortunately, the AI Act just will not have that same power.

Petra Molnar:

And even though there are some conversations happening, you know, in Canada and even in the US, the recently released executive order on AI that the Biden administration came out with was silent on border tech. It didn't mention borders once. Right? So, again, it's not a priority, and I think that's actually deliberate. Right?

Petra Molnar:

Because leaving borders as a testing ground is useful for the state, because it allows the state to experiment in ways that it just wouldn't be able to, you know, when it comes to other spaces of public life, for example.

Jack Sanker:

Would be broadly unconstitutional and violating a lot, yeah, a lot of rules and laws. Yes.

Danessa Watkins:

Well, and like you said, not just the state, but these private companies, like you mentioned in The Walls Have Eyes, I forget what company it was, but setting up in Israel, where, you know, they've made billions of dollars testing different technologies out on the Palestinian population, and then selling that technology to, you know, other states to use however they need to. And because I know we say AI and it just sounds like, you know, what is AI? I just wanted to give a couple of examples that you brought up, which I think we as lay people can understand the issues with. One thing you addressed was the voice printing, the cross-matching of somebody's voice with a database of accents. And you raised the issue that, you know what, dialects can change under different circumstances.

Danessa Watkins:

And that one just hit me, because I just came back from my native state of New Hampshire. And when we were there, my husband was like, you're suddenly dropping your r's, and you're speaking in a totally different way when you're with your family. But it's true. The circumstances, the subject matter. You know, have you seen some of that play out, I guess?

Petra Molnar:

Absolutely. And, you know, thanks for bringing that up, because that is such a visceral example that I think a lot of people can identify with. For me too, you know, English is not my first language, and I find that, you know, when I'm tired, for example, my accent comes out more. Or if I'm in different spaces, we code-switch, right, as well. And so our accents change based on who we speak with.

Petra Molnar:

You know? And it's just such a shortsighted way of trying to understand human beings. Right? I mean, this voice printing program was used by Germany in its asylum system, for example, and it would have somebody speak, you know, into this program, and then it would be assessed, and the program would say, oh, well, this person is likely from Daraa province in the south of Syria because of their accent, and not Aleppo, a big city. But they said that they're from Aleppo, and therefore they must be lying, and therefore we need to assess their credibility in a different way.

Petra Molnar:

Right? But I think this is the problem with AI. Like, human beings struggle with the complexities of just human behavior. Right? It also makes me think of these other projects that I find really disturbing, these pilot projects that basically created a so-called AI lie detector to be tested out at the border.

Petra Molnar:

And this AI lie detector would use facial recognition or microexpression analysis to be able to make a determination about whether or not someone's telling the truth while they're answering questions from an avatar. But, you know, again, I used to practice refugee law, and for any listeners who have practiced in this area, this might be familiar. Like, I've had people that I was representing maybe act in a way that a judge was having trouble understanding, or they weren't making eye contact with a judge of the opposite gender, because of religion, because of their experiences, maybe because they were nervous. Right? Or what about the impact of trauma on memory, and the fact that we don't tell stories in a linear way anyway, let alone some of the most difficult things that we have to talk about?

Petra Molnar:

Like, human decision makers struggle with this and make very problematic assumptions about credibility, plausibility, truthfulness. That is the starting point of any refugee assessment. Right? And so if human decision makers do that, what will a partially or fully automated system be able to do? One that's predicated on the very biased and discriminatory world that we live in already.

Petra Molnar:

Like, to me, it's just so troubling that we are seeing these AI technologies, whether it's voice printing or AI lie detection or anything in between, kind of forced into the migration space without even, like, did they talk to a refugee lawyer, or a refugee themselves, right, before piloting some of this? Like, to me, this is really disturbing.

Jack Sanker:

And we can't even, you know, post hoc go and look at what those inputs are, because in a lot of those cases, as you talk about in the book, it's classified. It's secretive. And there's no way for anyone to look into it and decide whether this stuff even works. And, I mean, all of these are operating under the assumption that, when applied correctly, this algorithmic decision-making process will be accurate, and I think even that is extremely presumptuous. I tend to be, and sometimes the show tends to be, a little bearish on AI and tech and things of that nature.

Jack Sanker:

And the assumption that this stuff, like, could even work is, I think, one that requires a lot of faith. And I don't know that it's, I mean, even setting aside the biases that are gonna be inherent in setting up these algorithms and these processes and policies, I just don't know that they're effective enough for anyone to rely on them for these, you know, important decisions that are being made at the border and in our immigration process.

Petra Molnar:

Yeah. For sure. I mean, I think that has to be the starting point. Right? Like, does this stuff even work?

Petra Molnar:

Some of it is snake oil. Right? Like, the AI lie detector was debunked as something that's just not even working. But that's the disturbing part. Like, it's just kind of thought of as, well, this is a good idea to try out at the border, again, because it's an opaque space, it's a discretionary space.

Petra Molnar:

There's not a lot of accountability or oversight or anything like that. And so I think it's, yeah, it's actually about these bigger questions. Like, what kind of world are we building, and are we okay with these technologies in the first place when we know that a lot of them don't even work? Not to mention, it's also about the kind of direction of travel, so to speak. Right?

Petra Molnar:

Because they're always aimed at marginalized communities, or people who don't have access to the same level of rights, or even legal representation sometimes. Like, we could be using AI to audit immigration decision making, right, or to root out racist border guards. Like, that's a choice. Right? But instead, AI is positioned toward people on the move, or refugees, people who oftentimes don't even know that AI and automated decision making is being applied in their case.

Petra Molnar:

This happened in Canada, where lawyers, after the fact, when they got their hands on some evidence of refused visa applications, for example, saw the decisions and were like, hold on, a human did not write this. Right? But if, again, English is your 2nd or 3rd or 4th language, and you don't have a lawyer, and you have no idea, like, you're not gonna assume that AI is making the decisions.

Petra Molnar:

Right?

Jack Sanker:

Right.

Petra Molnar:

It's just the whole thing is really, really troubling, because so much of it happens behind closed doors, and even lawyers sometimes don't know what's going on.

Jack Sanker:

That's such a good point. If it was so reliable and so trustworthy, then you would see it applied to audit judicial decisions. And people that actually have an opportunity to push back against it, you would see it applied against them, but you don't. You only see it applied against folks that have, you know, no ability to challenge it.

Petra Molnar:

Yeah. That's right. And it's not even just in the immigration space. Right? We're seeing similar incursions in criminal justice with predictive policing, right, and sentencing algorithms, welfare algorithms, child protection algorithms.

Petra Molnar:

It's not an accident, right, that it happens kind of in marginalized spaces rather than in spaces of power.

Jack Sanker:

Right. You're not seeing it used to audit, like, prosecutors or judges or things of that sort. Yeah. Just the folks that can't defend themselves.

Petra Molnar:

Yeah. Yeah. And on one hand, maybe we should be calling for that. Right? But then it also normalizes tech in those spaces too, which maybe isn't a good thing either.

Danessa Watkins:

But those are the people that would probably, you know, have platforms to speak out against it. I can just imagine the boardroom of the people that are collecting the data on this. And I'm sure the assumption is, oh, wait, you know, 60% of these people were found lying. That means it's working. You know? Like, that's how they're reading the data.

Danessa Watkins:

Like the goal is to keep people out. So if it's doing that, then it's working. Yeah. It's scary.

Petra Molnar:

Well, that's exactly it. I think, you know, what you just said is really key. It's about paying attention to the priorities and whose priorities actually set the stage. Right? And if the priority is exclusion and keeping people out and finding a technical solution to a complex societal problem, then there you go.

Petra Molnar:

This is what kind of drives, again, this border industrial complex. Like, it's not an accident that we're seeing Robodogs and AI lie detectors and drones, because they're seen as a solution to a problem that has been created by powerful actors who say that people on the move are the problem.

Danessa Watkins:

Right. Well, and I do love the way that you wrap up your book, Petra. I just wanna quote one part from one of your last chapters: "People on the move often have to prove that they are somehow worthy of our respect." That line stuck with me. And then you end with this story from Zaid Ibrahim, and he says, "What was our crime that we fled from Syria?"

Danessa Watkins:

You know, we kind of put refugees in this other category, but this changed my whole mindset to: no, that could be me. And, you know, what is my crime? Fleeing somewhere where my family can get shot in the streets, or we don't have food, or we don't have access to medicine? We need to humanize this, and I think your book does a good job of that. These are people, and what's being used against them is not right; it wouldn't be allowed in, you know, our day-to-day lives.

Danessa Watkins:

So why are we allowing it in these spaces?

Petra Molnar:

Yeah. And, you know, that was really my whole goal with this book, and indeed with the whole corpus of my work: to humanize the situation and to kind of try and, at least in some small way, fight against this dehumanization narrative that presents people on the move as threats, as just someone out there that, you know, people cannot relate to. But what you just said, I think, is really key. I mean, this can happen to anybody, and it has, in fact, historically. Right?

Petra Molnar:

Conflicts happen, and people have to flee. And again, people don't choose to be refugees, and we would do anything to save our children or our families, right? You have to make some desperate decisions. But I think we can't lose sight of the fact that there are real human stories at the center of all of this, and that, hopefully, really is the takeaway of the book.

Jack Sanker:

You describe yourself as a reluctant lawyer, but one way that we can tell you're an authentic lawyer is that the PDF of the book that you sent is "dot final," which is, like, the classic, you know, 10 drafts later, "this is dot final," all caps.

Petra Molnar:

You know, I just wanna thank you both for, like, really engaging with the book. Not everybody does that, and I'm really grateful for it.

Jack Sanker:

Well, thanks for writing a good book. Not everyone just has it.

Petra Molnar:

I thought it was pretty nice.

Jack Sanker:

Some people write really boring books, and then we're stuck.

Petra Molnar:

Well, you know, I will say, from the get-go I knew this was not going to be an academic book. Like, yeah, it's undergirded by years of research and analysis, but I wanted it to be something that everybody could pick up. And I have to say that the proudest moment of my career was when we found out that it was available at Target. You know? Like, to me, I'm like, this is great.

Danessa Watkins:

So for our listeners, where can they find you on socials? Where can they find your book? Give us some information, because I hope that our listeners will pick it up.

Petra Molnar:

Sure. Yeah. I mean, if you're interested in finding out more about the work or about the book in general, I have a website, just my name, petramolnar.com. I'm also on Twitter, and the book is available anywhere books are sold, whether that is a small local bookshop, hopefully, or Amazon, or Target. It's available.

Jack Sanker:

Thank you so much. I mean, first of all, thank you for writing the book, irrespective of whether you were gonna be on the show or not. It's a tremendous work that you spent a ton of time and effort on, and you put yourself in harm's way quite a few times. It's an excellent read. Highly recommended to anyone who's looking to learn about this topic, but also to be entertained.

Jack Sanker:

It's got a really fantastic narrative as well. Thank you for spending the last, you know, hour and ten minutes talking with us about this. And for those of you that wanna learn more, check out Petra's website and Twitter account, and buy the book.

Danessa Watkins:

Yeah. We can't wait to see you quoted, you know, at the international level as the reason why they start looking at these policies more seriously and changing things. So we appreciate your work and all your time.

Petra Molnar:

Thank you so much for having me.

Danessa Watkins:

Alright. Well, that's our show. We wanna say a huge thank you again to Petra Molnar for joining us. Such an interesting topic, and certainly one that I think everyone needs to pay attention to, given that these technologies are now going mainstream. So I hope you pick up the book and enjoy it as much as we did.

Danessa Watkins:

Again, we will be back every 2 weeks. You can listen to Litigation Nation wherever you get your podcasts, and we will provide links to access The Walls Have Eyes as well. We'll see you next time. Thanks.
