Gaza: AI Targeting a Cover for Genocide


Shir Hever discusses investigative work by the Israeli/Palestinian magazine +972, which exposed the use of AI targeting to justify the Israeli bombing of apartment buildings and hospitals.


Paul Jay

Hi, I’m Paul Jay. Welcome to theAnalysis.news. In a few minutes, Shir Hever will join us again to discuss the current situation and the Israeli onslaught against the Palestinian people in Gaza and the West Bank, and specifically the use of artificial intelligence to target Hamas leaders and militants without much care for how many civilians get killed. In fact, some people argue killing civilians might even be part of the objective. I’ll be back in just a minute.

The magazine +972, an Israeli-Palestinian news magazine published in Israel with both Israeli and Palestinian journalists, very recently published an article about the use of artificial intelligence to target Hamas leaders and militants in Gaza. It was not only about the use of AI, but about the willingness, and even the policy, of not caring much how many civilians are killed in the course of targeting Hamas leaders, and about a great reliance on AI to target an apartment building, a school, or a hospital, with maybe 20 seconds spent checking whether the AI is actually telling you anything that makes sense.

Now joining us to discuss the article, what it means, and its significance for the South African case accusing Israel of genocide is Shir Hever. Shir was born in Israel. He now works for BDS in Germany and is, by training, a political economist. Thanks for joining us, Shir.

Shir Hever

Thanks for having me, Paul.

Paul Jay

So tell us what’s in the article, and then we can talk about what this means.

Shir Hever

Yeah, this is an article by Yuval Abraham, a Jewish Israeli working for +972 Magazine and a very serious investigative journalist, who has received confidential information from anonymous sources. These anonymous sources are high-ranking officers in Israeli intelligence. He has put together a picture of how the target acquisition mechanism works for this particular attack on the Gaza Strip, which amounts to the crime of genocide. I guess we’ll get to why it is relevant to talk about genocide in this context.

What he found out from the testimonies of these officers is how the Israeli military’s normal mechanism of acquiring targets works: military intelligence units identify so-called desirable targets, meaning some high-ranking militant or asset belonging to Hamas that they want to destroy or kill, and then calculate the collateral factor for the attack. The collateral factor means how many civilians are going to be killed for the sake of killing the one person you want to assassinate. Normally, they would limit the collateral factor to about five. So, five civilians are deemed acceptable to kill in order to kill one person who is suspected, not convicted in any court, of being a member of Hamas.
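To make the arithmetic concrete, here is a minimal sketch of what such a ratio test amounts to. Everything below is invented for illustration; the function and parameter names are hypothetical stand-ins, not anything from the actual Israeli system.

```python
# Hypothetical illustration of a "collateral factor" threshold check.
# All names and numbers are invented; this only shows the
# civilian-to-target ratio arithmetic described in the interview.

def strike_permitted(expected_civilian_deaths: int,
                     intended_targets: int,
                     collateral_limit: float = 5.0) -> bool:
    """Return True if the civilian-to-target ratio is at or below the limit."""
    if intended_targets <= 0:
        return False  # no intended target, no strike
    collateral_factor = expected_civilian_deaths / intended_targets
    return collateral_factor <= collateral_limit

# Under the roughly 5:1 ratio described for earlier wars:
print(strike_permitted(expected_civilian_deaths=5, intended_targets=1))   # True
# Under the ratio of "almost 100" that the officers describe now, an
# entire apartment building clears the check:
print(strike_permitted(expected_civilian_deaths=95, intended_targets=1,
                       collateral_limit=100.0))                           # True
```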

This ratio was used in previous wars. This time, they are using a different formula. First of all, the officers say they were horrified to discover that the ratio has risen to almost 100, meaning that an entire apartment building can be destroyed. Sometimes, the objective of destroying this building is not to kill some high-ranking Hamas officer but to cause panic, dismay, and suffering among the civilian population as a way to pressure Hamas, in a very tried, tested, and always failed method of colonial violence.

Paul Jay

When you say officer, to clarify a bit, the journalist from +972 states that he has talked to intelligence officials, military officials, or soldiers who are involved in the targeting. He’s talking to people with direct knowledge of what’s happening. I should add that he was interviewed on CNN, which, to their credit, gave him a fair amount of time to explain what was happening. He is quite a credible journalist.

Shir Hever

Yes. He also gave an interview to Democracy Now. His position is very moral and very ethical. He’s focusing on the needless killing of civilians, and he’s horrified by it, and for very good reason. 

I think we should pay attention to something in this article that didn’t receive enough attention, in my opinion, which is the fact that this is the first time in history that artificial intelligence has been weaponized and used directly as a weapon of war. You have entire units of intelligence officers who used to produce about five to six targets per day, namely the five or six Hamas officers or militants they wanted to kill, and who established a collateral factor for each target. Now, an artificial intelligence tool is manufacturing more than 100 targets per day. This is what is enabling the Israeli military to carpet bomb the Gaza Strip.

This is, first of all, unprecedented. But we also have to understand how this technology works. Artificial intelligence is sometimes seen as some kind of black box, something that we’re not able to understand. I do think that for our very safety, for our very understanding of what is happening to modern warfare, we need to know. We need to know how this works.

The way the artificial intelligence allegedly works, according to what the Israeli military claims while trying to promote this as a product, is this: the artificial intelligence uses facial recognition software to go over thousands of pictures and videos from drones and surveillance cameras and produce an analysis of every centimeter of the Gaza Strip. Then they can say: here we have identified a certain target, and we’ve also identified everyone around that target, so we know how many civilians are there and what the collateral factor is. A very important part of this is that they also claim to be able to assess how many possible Israeli hostages are in the area. They say that with facial recognition, they will be able to avoid accidentally bombing Israeli hostages in the Gaza Strip. This is the product they’re selling.
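Schematically, the claimed pipeline is: match detected faces against a watchlist, flag possible hostages, and count everyone else as potential collateral. The sketch below is a toy reconstruction of that claim only; every function, list, and identifier is a made-up stand-in, not the real system.

```python
# Toy sketch of the claimed pipeline: match detected faces against a
# watchlist and a hostage list, count the rest as civilians nearby.
# All data and names below are invented stand-ins.

from dataclasses import dataclass

@dataclass
class Detection:
    face_id: str     # assumed output of some face-recognition model
    location: tuple  # (x, y) cell in the surveyed area

WATCHLIST = {"suspect_0042"}     # hypothetical target list
HOSTAGE_LIST = {"hostage_0007"}  # the claimed hostage-avoidance check

def analyze_frame(detections: list[Detection]) -> dict:
    targets = [d for d in detections if d.face_id in WATCHLIST]
    hostages = [d for d in detections if d.face_id in HOSTAGE_LIST]
    bystanders = [d for d in detections
                  if d.face_id not in WATCHLIST | HOSTAGE_LIST]
    return {
        "targets": len(targets),
        "hostages": len(hostages),
        "estimated_civilians": len(bystanders),  # feeds the collateral factor
    }

frame = [Detection("suspect_0042", (3, 4)),
         Detection("unknown_101", (3, 5)),
         Detection("unknown_102", (2, 4))]
print(analyze_frame(frame))
# {'targets': 1, 'hostages': 0, 'estimated_civilians': 2}
```

Note that in such a scheme, every unrecognized face is simply binned one way or the other; the output is only as good as the recognition step feeding it.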

Now, what we’re really seeing on the ground is something completely different. What we see is, in fact, an artificial intelligence model very similar to ChatGPT, in the sense that this language model has some kind of conversation with the officer. The artificial intelligence gathers these pictures and creates a target, but then it begins a process of teaching itself. That’s the whole idea of machine learning: the artificial intelligence teaches itself what kind of target would be most convincing for the soldier to squeeze the trigger. There’s a soldier sitting at a cannon or guiding a fighter plane to bomb a certain area, and the soldier receives the target from the artificial intelligence and has to make a decision, yes or no. Those are the 20 seconds you talked about. Sometimes it’s less than 20 seconds, according to the testimonies of those soldiers.

Basically, the artificial intelligence teaches itself how to condense the information into the most abbreviated and most convincing form, so that the soldiers don’t bother reading everything and squeeze the trigger right away. In a way, the manipulation here is of the Israeli soldiers themselves. They are the weapons being utilized by the artificial intelligence to kill more people in Gaza. This is really a very dangerous development.
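The dynamic described here, a system rewarded for quick approval learning to show less and less, can be sketched as a simple bandit problem. Everything below is illustrative; the numbers are invented, and the only assumption is that rushed operators approve short reports more readily than long ones.

```python
# Minimal bandit sketch of the "condensing" feedback loop: if quick
# approval is the reward, the learner converges on the shortest, most
# convincing presentation. All probabilities here are invented.

import random

random.seed(1)

SUMMARY_LENGTHS = [1, 5, 20]               # paragraphs shown to the operator
value = {n: 0.0 for n in SUMMARY_LENGTHS}  # estimated approval rate
counts = {n: 0 for n in SUMMARY_LENGTHS}

def approval_probability(length: int) -> float:
    # Invented stand-in: the longer the report, the less likely a rushed
    # operator reads it and approves within the ~20-second window.
    return 1.0 / length

for step in range(5000):
    # Epsilon-greedy choice of how much detail to show.
    if random.random() < 0.1:
        n = random.choice(SUMMARY_LENGTHS)
    else:
        n = max(SUMMARY_LENGTHS, key=lambda k: value[k])
    reward = 1.0 if random.random() < approval_probability(n) else 0.0
    counts[n] += 1
    value[n] += (reward - value[n]) / counts[n]  # incremental average

print(value)  # the one-paragraph summary ends up with the highest value
```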

Paul Jay

If I’m understanding the technology correctly, AI does not have X-ray vision. It’s working from some photographs and some radar, but what it produces are probabilities. What they’re really feeding the soldier who’s actually going to fire is that there’s a probability that so-and-so is in this building. It’s not that there’s direct evidence, necessarily; maybe sometimes, but if you’re generating that many targets, it’s mostly probability. Based on just probability, they’re willingly killing hundreds of people in these attacks, each one of them.

Also, anyone who has worked with AI just on a text basis, and I’ve done quite a bit, knows it’s amazing how accurate it is most of the time, but it’s also amazing how often it makes shit up that’s completely, utterly wrong. In fact, they have a term for it in the AI world: hallucinating. AI tends, once in a while, to out-and-out hallucinate stuff that has nothing to do with reality, and it’s being relied on for targeting.
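A back-of-envelope calculation shows why error rates matter at this volume. The accuracy figure below is an invented assumption for illustration; only the 100-targets-per-day rate comes from the article.

```python
# Back-of-envelope arithmetic: even high per-target accuracy produces
# many wrong strikes at volume. The 90% figure is an invented assumption.
accuracy = 0.90        # assumed probability a generated target is correct
targets_per_day = 100  # generation rate reported in the +972 article
wrong_per_day = targets_per_day - targets_per_day * accuracy
print(wrong_per_day)   # 10.0 misidentified targets per day, every day
```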

Shir Hever

It does have one thing to do with reality, and that’s the whole point, because artificial intelligence has been programmed to learn by interfacing with the user. If the AI comes to the conclusion that telling you something you want to hear, something you want to see, will create positive feedback, then the AI is more likely to go down that path. If there’s an uncomfortable truth the AI is supposed to tell you, you’ll notice that ChatGPT, for example, will try to avoid giving it to you. If you tell ChatGPT you’re looking for a certain book on a certain topic, and that book doesn’t exist, you’re not going to get that answer. ChatGPT will invent a book to give you what you want to hear, even if that book was never written.

Now, this is exactly how it works with the bombing of Gaza because, as you say, AI doesn’t have X-ray vision. Theoretically, the soldiers can vet all of the pictures the AI is using to create a target. But as Yuval Abraham, the author of this article, says, because they lack patience and don’t want to go through the very tedious process of vetting each and every picture, the soldiers end up just checking the gender of the main target. If it’s a woman, they don’t shoot, because they don’t believe a woman is a Hamas fighter. If it’s a man, they shoot without checking anything further.

Now, this is exactly what teaches the AI to only show pictures of men. That is why, if there is a man somewhere in the radius of the explosion, that’s what the AI will focus on, and that’s how the soldiers can be convinced.
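The loop Hever describes can be written down directly: if gender is the only check the operator performs, then the approval data trains any model fitted to it to treat “man in frame” as the decisive signal. Below is a toy logistic-regression sketch of that bias; every detail is invented for illustration and implies nothing about the real system’s internals.

```python
# Toy sketch of the bias loop: a model trained to predict operator
# approval, where the operator's only check is gender. All details
# are invented for illustration.

import math
import random

random.seed(2)

weights = {"man_in_frame": 0.0, "woman_in_frame": 0.0}
bias = 0.0
LR = 0.1

def predict_approval(frame: dict) -> float:
    z = bias + sum(weights[f] for f, present in frame.items() if present)
    return 1.0 / (1.0 + math.exp(-z))  # logistic probability of approval

def operator_approves(frame: dict) -> bool:
    # The vetting shortcut described in the article: gender is the only check.
    return frame["man_in_frame"] and not frame["woman_in_frame"]

# Online logistic-regression updates on simulated vetting decisions.
for _ in range(5000):
    frame = {f: random.random() < 0.5 for f in weights}
    error = float(operator_approves(frame)) - predict_approval(frame)
    bias += LR * error
    for f, present in frame.items():
        if present:
            weights[f] += LR * error

print(weights)  # "man_in_frame" ends strongly positive, "woman_in_frame" negative

# A system that surfaces whichever frame scores highest will now always
# put a frame containing a man in front of the soldier:
candidates = [{"man_in_frame": True, "woman_in_frame": False},
              {"man_in_frame": False, "woman_in_frame": False}]
print(max(candidates, key=predict_approval))
```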

Now, I want to tie this to the issue of genocide because there’s a lot of debate in the legal world about why South Africa chose the crime of genocide, which is such a serious crime, as the focus of its lawsuit at the International Court of Justice. From a moral and ethical point of view, I think they were, of course, correct because this is what it is. From a legal, strategic point of view, you could say this is a crime that’s difficult to prove. I do think that one of the most important issues in proving genocide is that societies that cross this red line, from waging war to committing genocide, have to go through a process of getting their own soldiers to cross that red line. That is one of the most difficult things.

In Rwanda, the Hutu consistently called the Tutsi cockroaches in order to dehumanize them and get the soldiers not to see them as human beings, because that’s the only way you can get soldiers to kill civilians indiscriminately. The Nazis, of course, had very elaborate mechanisms for dehumanizing Jews and dehumanizing Sinti and Roma as a way to get the soldiers to obey orders and commit genocide. It’s very difficult. It’s much easier to convince your soldiers to defend your homeland in battle than to send them around killing civilians.

This is really the reason that Israel needs artificial intelligence, because from the point of view of the soldiers, they are getting a target and making an educated decision based on data they’re getting from the AI. But if you go to Gaza and look at the ground, as the reporters in Gaza, who are dying every day but nevertheless continue to report, are recording, this is just indiscriminate carpet bombing, because you have these hundreds of soldiers, each one of them thinking that he’s unique and just got the best target, squeezing the trigger again and again and again. I’m hearing reports from the Israeli artillery units. They have these M-107 cannons, with a rate of fire that allows them to shoot 500 155-millimeter shells per 24 hours. And that’s what they’re doing.

Paul Jay

Well, maybe the real point of AI is to give a fig leaf of justification for carpet bombing. In other words, instead of just calling it carpet bombing, we claim we’re targeting, and the civilians we kill are just collateral damage, when the reality is that the objective is to kill a lot of civilians and make Gaza completely and utterly unlivable. But you can say, oh, no, there was a Hamas leader in this building. Well, how do you know? Well, AI told us. It’s actually a fig leaf. Sorry, go ahead.

Shir Hever

For whom is this fig leaf intended? When the Israeli team has to defend itself at the International Court of Justice, AI doesn’t help them. They cannot go to the International Court of Justice and say, “Our AI told us that this is a Hamas leader.” As these intelligence officers told Yuval Abraham, many times this so-called Hamas leader happens to be just a guy with a gun, and that is enough. That’s all they can show. That certainly doesn’t justify demolishing an entire apartment building with the people inside it.

Paul Jay

But that is what they’re going to say. What other defense do they have?

Shir Hever

Yeah, but this is not going to help them. It’s not going to work. But for the soldiers, it does work. So, the fig leaf is a manipulation. Absolutely, they are lying. They are using AI to manipulate people, but the people they’re manipulating are their own soldiers.

This is the first war in Israel’s history in which the soldiers are completely banned from contacting their own families and friends back home. It has now been 101 days, or 102 or 103, depending on when you broadcast this. During this time, the soldiers cannot call their girlfriends, cannot call their parents, and cannot tell them what’s happening in Gaza. Even more importantly, they cannot hear what people back home, who can watch the news and follow the situation, are hearing. The point is not just to prevent the public from knowing what’s happening in Gaza; even more importantly, it is to prevent the soldiers from knowing that the whole world is watching and calling what they’re doing an act of genocide. Some soldiers who received a little bit of leave, and not many receive leave to go and be with their families for a weekend or so, exhibit serious signs of PTSD because the reality back home clashes completely with what they saw on the ground in Gaza.

Paul Jay

I think there are two other parts to this that don’t get discussed enough. Netanyahu and the Israeli propagandists try to compare what they’re doing to what the British and the Americans did in Germany: the firebombing of Hamburg, or the American firebombing of Japanese cities, and then eventually the atomic bomb, which also had a fig leaf of a military target. We know without question that both of these things were done to try to break the morale of the German or the Japanese people, which means the civilians were the targets. That is a war crime. So if Israel wants to compare what it’s doing to that, then it’s comparing war crime to war crime. This doesn’t let them off the hook.

Then there’s another, even more damning piece, if you will, which is that the British and Americans vis-à-vis Germany, or the Americans with Japan, were at war with another state. Maybe you can make some argument that the populations of those states were, in some perverted way, targets. Gaza is not a state. Gaza is under occupation. My understanding of international law is that you cannot attack the population of a place that’s under your occupation.

The attack against the Israelis on October 7 was a terrorist and murderous attack; you can use any adjectives you want, and I condemn it. But it was not an act of a state. To attack the population of Gaza when you’re the occupying power has no basis in international law, as far as I understand it.

Shir Hever

Yeah, well, I’m not the best person to talk about international law. To my understanding, it’s also very much illegal to tell the population of Gaza that they have 24 hours to leave the northern part of Gaza and that everyone who stays behind will be killed. That’s also something that neither the United States nor Great Britain did in any of those examples, whether the fighting was against Germany or Japan or anyone else.

I do think the comparison is interesting, especially the use of the atomic bomb on Hiroshima and Nagasaki. That’s a very good example because the American administration at the time knew that it could not just send a pilot with an atomic bomb to destroy a civilian city. They had to lie to the pilot. They had to lie to the mathematicians, [John] von Neumann and [Oskar] Morgenstern, who later established the RAND Corporation, and tell them: we need you to develop a model using game theory to find which targets would be least defended by air defense systems in Japan. They came up with the cities of Hiroshima and Nagasaki, without being told the real purpose of this mathematical exercise. I think the American administration understood that you cannot expect people to commit atrocities on their own. You have to lie to them and manipulate them. So that’s a very interesting example here.

When you say Gaza is not a state, a lot of Israelis would say this is just a technical issue, because if Hamas is so strong, and if we are so afraid and have to defend ourselves, then we should fight with all our force to survive. This sort of argument, which they tried to use at the International Court of Justice, is based on a fantasy, on a hallucination: as if Hamas can be defeated by killing a lot of children and a lot of unarmed civilians, and that will somehow weaken Hamas. Yet it doesn’t. In fact, look at the rate of casualties in the Israeli army over 100 days. They’re being killed every day in Gaza by Hamas fighters, not by innocent, defenseless civilians. They’re being killed by Hamas fighters who still control all these tunnels and have access to enough weapons, petrol, and everything else they need to continue their fight against the Israeli military.

The Israeli army has failed to rescue any of the hostages. They have failed to target or assassinate even one of the leaders of Hamas in Gaza; they only assassinated one in Lebanon. In Gaza, all of this bombing has achieved nothing except killing a lot of civilians. So that, too, is a hallucination. That, too, is a lie. No number of thousands of families trapped under the rubble, dying slowly and prevented by the Israeli military from being rescued, will make Israel win this war.

Paul Jay

Were the statements of the South African attorneys at the court shown on Israeli television? And if so, did they have any effect on people? Because what was said there was quite eloquent and powerful.

Shir Hever

It was not. Sadly, it was not. The Israeli defense was shown. The responses were shown, but not the actual accusations. You hear so many cries of indignation from the Israeli public, from journalists, from politicians calling it a blood libel. How can you say that the state of Israel is killing children, when they are, in fact, killing a lot of children? How can you do this to the Jewish people? And yet it was almost the second sentence uttered by Israel’s legal team at the International Court of Justice that the whole Convention on the Prevention of Genocide was created for Jews and belongs to Jews, and that, therefore, Israel is above the law and cannot be targeted by this convention. This is the sort of argument you hear in Israeli media.

I am a little bit taken aback. I’m listening carefully to Israeli media, following it very closely, and they are actually admitting each and every element in the accusation of the South African legal team. They’re saying: yes, there were calls for genocide; yes, there was targeting of civilians and use of starvation as a weapon. What you don’t see in the Israeli media is anyone putting these things together. If you asked, well, did Myanmar commit genocide against the Rohingya, the Israelis would say: absolutely, one, two, three, this is how genocide is defined. They recognize the one, two, and three that Israel is committing but are not able to reach the same conclusion.

Paul Jay

The AI systems we have been talking about, are they designed and manufactured in Israel?

Shir Hever

I don’t believe so. In fact, there was an index of which countries have the most advanced artificial intelligence technologies in the world, and Israel ranked very low, even below the United Arab Emirates, because the Israeli so-called high-tech miracle really developed as a weapon of oppression against Palestinians. Artificial intelligence is a different kind of technology. They just don’t have it.

Paul Jay

Do we know where they’re getting it from?

Shir Hever

This is a very big question. I had some suspicions that Project Nimbus, a project by Amazon and Google providing cloud services to the Israeli military, might also be providing artificial intelligence services. I haven’t found proof of this yet, so I’m not making that accusation at this point.

Then there’s the company Palantir, which, as you may know, was co-founded by Peter Thiel, a big Trump supporter. It is a company already well known for its technology of surveillance and oppression. They officially announced that they had signed a contract with the Israeli military to provide artificial intelligence. They’re not saying it’s artificial intelligence for the purpose of acquiring targets, but I think they are right now the prime suspect as to who is providing this technology.

Paul Jay

Okay. All right, thanks very much, Shir. We’ll pick this up again soon.

Shir Hever

Thank you, Paul.

Paul Jay

Thank you for joining us on theAnalysis.news. Don’t forget, if you come over to the website, get on the email list if you’re not on it. If you’re on YouTube, subscribe, and you can always hit the donate button if you’re so moved. Thanks again.




Dr. Shir Hever studies the economic aspects of the Israeli occupation of the Palestinian territory. He is the manager of the Alliance for Justice between Israelis and Palestinians (BIP) and the military embargo coordinator for the Boycott National Committee (BNC).
