Amnesty International released a new report Tuesday entitled “Automated Apartheid,” a follow-up to their bombshell report in 2022, which found that Israel’s treatment of Palestinians amounts to the crime of apartheid under international law. Building on their previous legal analysis, the latest publication exposes how Israeli authorities use technology and surveillance tools to intensify and entrench a system of control over Palestinians, or, in the words of Amnesty’s Secretary General Agnès Callamard, “to supercharge segregation and automate apartheid.”
The 80-page report focuses in particular on the use of facial recognition technologies and networks of CCTV cameras in Hebron and East Jerusalem — the only two cities in the occupied territories where Israeli settlements exist within Palestinian urban areas, and which are experiencing continuing settlement expansion and property takeovers. The report includes descriptions of Palestinians in Hebron encountering a new AI-powered system called “Red Wolf,” deployed at Israeli checkpoints, which scans Palestinians’ faces and determines whether they can pass the checkpoint or will be denied entry. It also covers areas in East Jerusalem that have seen an intensification of surveillance networks in recent years, including the neighborhood of Sheikh Jarrah, Damascus Gate in the Old City, and the compound around Al-Aqsa Mosque.
The premier human rights organization takes an unequivocal position against the use of these facial recognition technologies, which it locates within a wider web of mass surveillance throughout the occupied territories. Such methods, Amnesty argues, restrict Palestinians’ freedom of movement, attempt to fragment and control the population, and create a “coercive environment … which routinely make [Palestinian] lives unbearable, [and] which is aimed at forcing Palestinians to leave areas of strategic interest to Israeli authorities.” The group’s call on Israel to “immediately cease” the deployment of these surveillance systems, as the report states, sits within Amnesty’s larger efforts to “dismantle the Israeli state’s system of oppression and domination against Palestinians.”
In an interview for +972 before the report’s publication, I spoke to Dr. Matt Mahmoudi, a researcher and adviser on artificial intelligence and human rights at Amnesty, about the findings and implications of the ground-breaking report. The interview has been edited for length and clarity.
Can you describe the investigation and its findings?
The report is a joint output that comes on the back of Amnesty’s “Ban the Scan” campaign as well as our campaign to end Israel’s apartheid against Palestinians. We were specifically interested in looking at the ways in which facial recognition had been deployed in Hebron and East Jerusalem. It involved initially identifying some of the hardware that was deployed in these areas by way of images that were taken by colleagues or partner organizations, especially [on] the paths regularly frequented by Palestinians. In the context of Hebron, once we had a sense of the hardware in place and some of the corporate actors that might be playing a role there, we looked at camera infrastructure in and around checkpoints in H2 [the part of the city under full Israeli military control].
We worked together with both local organizations and our team to do field interviews with Palestinian families, activists, students, and experts who were documenting, being exposed to, and coming face-to-face with the usage of surveillance systems, in particular facial recognition. We also worked with Israeli civil society organizations like Breaking the Silence, which extended access to key testimonies from former and current soldiers who spoke to some of the technologies that we were hearing about through our witness testimonies.
Did you have access to the software that is being deployed by Israeli security forces, such as “Blue Wolf” [a previous, similar surveillance program in Hebron]? Are there new technologies that you discovered?
In Hebron we were particularly focused on some of the “smart city” technologies that have been rolled out over the last few years, such as Blue Wolf, but also the general “Smart City Initiative.” Whilst we were interested in some of the corporate actors and software being sold off the shelf, there was nothing documenting the systems that the [Palestinian] communities were speaking to. What we were left with was the implication that most of the software we were looking at was potentially being developed in-house by the Israeli security forces.
We could not interrogate the software up close for access reasons or security reasons. We were able to speak to Palestinians across checkpoints at which this [new] facial recognition software called Red Wolf is being deployed. We became aware of it because of how [Palestinians] were speaking to being identified by soldiers they didn’t know, pre-emptively and without producing any identification. Then we came across this testimony from Breaking the Silence that spoke to the facial recognition system being rolled out at these checkpoints.
The soldiers also speak extensively as to how the system is trained to recognize faces if it does not already recognize you. The implication here is that the system works, or is highly likely to work, together with previous systems that have been deployed in the area, such as Blue Wolf, and that the database curated by Blue Wolf, which throws up information on Palestinians, is potentially accessible through the Red Wolf system as well.
So if you’re known, for example, to have been scheduled for questioning or detention, you would be stopped and held at the checkpoint once you are recognized. Or conversely, if you are not recognized, by not having been previously registered, then the checkpoint would not let you through. It would have taken a picture [of you] the moment you walked through the turnstile, and the soldier is then incentivized to grab your ID and match it to the photo that was taken, such that the system can learn to recognize you over time.
Can you briefly explain the difference between Red Wolf and Blue Wolf? How do they work together?
Blue Wolf is an app, accessible on phones and tablets, which gives Israeli soldiers instant access to the information collected on Palestinians and stored in the Wolf Pack database. It is also used to conduct mass biometric registration of Palestinians in Hebron; soldiers scan faces with their mobile devices and add them, without the individuals’ knowledge or consent, to a database of exclusively Palestinian profiles.
Red Wolf is a facial recognition system which operates at checkpoints in the H2 area of Hebron. Only Palestinians have to use these checkpoints. Red Wolf cameras scan faces as they enter a checkpoint, and assess them against a database of exclusively Palestinian biometric profiles. Red Wolf uses this data to determine whether an individual can pass a checkpoint, and automatically enrolls any new face it scans. If no entry exists for an individual, they will be denied passage.
Red Wolf expands its database of Palestinian faces over time. An Israeli commander stationed in Hebron has, for example, explained that soldiers are tasked with training and optimizing Red Wolf’s facial recognition algorithm so it can start recognizing faces without human intervention.
We determined that there is a high likelihood that Red Wolf is connected to Blue Wolf and Wolf Pack. This is because these other, larger databases appear to be the main sources of up-to-date images and information about Palestinians available to Israeli authorities, and because Red Wolf appears to pull up similar information.
What you are describing sounds like an intensification of automated, “frictionless,” smart occupation with the rollout of Red Wolf at these checkpoints.
Precisely. Except it’s not frictionless, right? There are two things at play here. First of all, the notion of an occupation is that it is temporary, not permanent: the checkpoints have been made increasingly permanent with permanent staffing, more militarized infrastructure, and now with facial recognition. The second point is the fact that in these very marginal circumstances, Palestinians have relied on either the soldier knowing them, or the soldiers being uninterested in them, to be able to pass through the checkpoint. Now they have to rely on actual recognition, or on their willingness to be coercively conscripted into this facial recognition system, in order to pass. So actually, it’s exacerbating friction.
Amnesty’s 2022 report found that the Israeli government is committing the international crime of apartheid, and now this report focuses on facial recognition technologies and “automated apartheid.” How do those two reports relate to the intensification of apartheid?
In our first report, we focus on one of the arsenal of tools through which Israel accomplishes [apartheid] in the occupied territories. We looked at the ways in which the freedom of movement in particular — both under the International Covenant on Civil and Political Rights and the International Convention on the Suppression and Punishment of the Crime of Apartheid — is restricted.
The ways in which facial recognition reinforces those movement restrictions, by making it even harder for Palestinians to pass the checkpoint, is one way in which we see that particular pillar of apartheid exacerbated and reinforced. The other aspect, of course, is that the checkpoint is only used by Palestinians, and the facial recognition software plugs into databases consisting only of Palestinian faces. Its discriminatory nature is in line with how we understand the ways in which apartheid is exacted against a particular racial group.
The other aspect of our argument on how this reinforces apartheid is the way in which the surveillance is part and parcel of the coercive environment that is being experienced by Palestinians, in particular to force them out of areas of strategic interest to the Israeli authorities. For example, in places like East Jerusalem, in the aftermath of the crackdowns on Sheikh Jarrah and the protests against the evictions [by settlers]; the increase of surveillance in places like Damascus Gate; and places of religious significance to Palestinians and Muslims, like Al-Aqsa Mosque. We also paid attention to how surveillance increased in areas like Silwan, where we’ve seen in the last 10 years alone the skyrocketing number of demolitions with illegal settlers coming in, creating settlements under the auspices of the advancement of a biblical archaeology project called the “City of David.”
What we have seen is that, in tandem with those illegal settler activities, Israeli authorities have also increased their surveillance equipment. A number of surveillance cameras that we’ve identified [have] some out-of-the-box facial recognition capabilities, a lot of which we believe are at high risk of plugging into the “Mabat 2000” facial recognition system that is active in East Jerusalem.
What that does is it creates this coercive environment in which, if Palestinians attempt to even think about resisting those [settler] developments, they have to calculate an even higher risk of potentially being arrested, removed, etc. So surveillance around this illegal settler activity begets surveillance, which then begets more settler activity.
It is clear that there is a discriminatory angle to how the surveillance is applied. We have examples both in Hebron and in East Jerusalem, but especially in Hebron, where activists and families have reported cameras turning inwards toward their homes to disincentivize any form of assembly, congregation, or family life. So there is certainly a sense among the testimonies in our report that the area was being treated like a military base, in which Palestinians were effectively to keep as quiet as possible. From our previous reporting on apartheid and elsewhere, we have made the claim that the experiences of Palestinians in Hebron and the occupied territories are a form of collective punishment.
Far from serving a legitimate security purpose, Israel’s continued expansion of surveillance in East Jerusalem, an illegally annexed city, further violates the human rights of Palestinians, and in this way facilitates the expansion of illegal settlements by digitally cementing its domain of control. This vast expansion of surveillance is illegitimate, and is made all the more serious by the incompatibility of the technologies deployed with international human rights law and standards.
Amnesty International talks about dismantling the apartheid system, and that includes discriminatory systems and attitudes, but also this real physical hardware element.
Absolutely. We’ve seen some of that in our documentation as far as resistance goes — like resisting the [surveillance] infrastructure itself, because it’s a representation of Israel’s alleged security argument in areas like Hebron and East Jerusalem, where they [actually] have no security argument because it’s in a context of [illegal] annexation. The only security argument they have is to provide security for illegal settlers to move out of the area, as it were.
The infrastructure very much becomes a material manifestation of apartheid, one that opens up the possibility of resistance while also foreclosing it by creating extra risks. This is especially true when it comes to the calculus of wanting to engage in resistance and protests, and then having to potentially reel yourself back in because you fear being identified.
Amnesty’s 2022 report led to a more widespread application of the term “apartheid” globally, though, of course, Palestinians had been saying this for years. What do you hope the impact of “automated apartheid” will be, and how can it be used to apply pressure on the Israeli government to dismantle apartheid?
While the increased adoption and global usage of the apartheid framing has provided a vernacular to hold Israel to account for its crimes against Palestinians, this report serves to demonstrate one of the tools it deploys to exact its apartheid policies. By understanding how technology increasingly plays a role in allowing the scaling of violence, we also begin to realize that there’s nothing fundamentally inevitable about the development, sales, and deployment of these tools.
Pressure on the Israeli government is one thing, though perhaps by shedding light on the ways in which corporate actors also work in proximity to apartheid, we can help develop a sense that these sales are repugnant. Not only does facial recognition depend on the active surveillance of an ever-larger number of racialized communities, but it helps bolster apartheid. This cannot stand.
I should mention that two companies we identified within the Old City of Jerusalem and in Hebron’s H1 were HikVision, a Chinese surveillance developer, and TKH Security, a Dutch security company. They appear, especially HikVision, to have a growing presence in the area. We’ve reached out to them on several occasions and have yet to receive conclusive responses from either of them. TKH Security responded with a few lines saying that, as of a few years ago, they no longer have a relationship with the distributor we identified. They also said that they have no direct relationship with Israeli security forces, but they have not elaborated on why the relationship with the distributor ended, or how their products have ended up in the area. HikVision has simply not responded.
Let’s be clear that in the last year alone, the apartheid framework as it applies to Palestine has become more widely accepted. We have two Special Rapporteurs who have used the language in speaking to the particular experiences of Palestinians. I think that drawing attention to how some of the security and technological measures on that front, specifically AI applications, are being used to exacerbate apartheid can also help us fully understand the lived reality under apartheid in a broader sense.
My greatest hope is that people pay more attention to the conditions of apartheid and are outraged by what is happening in these contexts. This is just a sliver of experiences in the report. I wouldn’t dare to think about what the sum total of the area might look like if we were to apply this analysis more broadly.