https://sputnikglobe.com/20240409/israel-uses-military-ai-in-gaza-tool-of-genocide-or-simply-a-database-1117831884.html
Israel Uses Military AI in Gaza: Tool of Genocide or ‘Simply a Database’?
Sputnik International
2024-04-09T16:30+0000
Dmitry Babich
Tel Aviv’s special services and the army have been letting an AI-based machine decide on the life and death of tens of thousands of people for several months, according to a recent investigation based on interviews with the Israeli military.
They were the Gazans whom “the machine” suspected of being jihadists or their relatives. Those whom the AI program codenamed Lavender found suspicious enough were blown up together with the buildings they were in – with huge human “collateral damage.”
“As we see, the Lavender AI program or some other targeting program the Israelis are using – its purpose is not to destroy enemies – the terrorists, Hamas operatives, etc. Instead, it has been designed to destroy blocks of real estate… In the worst case, it is a bloodthirsty confirmation of the worst kind of secret genocide. The Israeli government is trying to erase the Palestinian people and all memory of Gaza like chalk drawings from the blackboard,” former US State Department analyst Scott Bennett told Sputnik.
In several interviews with international media, journalist Yuval Abraham confirmed the worst suspicions about the program.
“What Lavender does is scan the information on the people inside a given building in Gaza. The artificial intelligence, i.e. a machine, gives every individual a rating (on a scale from 1 to 100) on the likelihood that this individual is a member of Hamas or the military wing of Islamic Jihad group,” he explained in an interview with Manhattan-based alternative news program Democracy Now.
“In the next step, the machine ordered the destruction of the building with people inside it, including children. There was minimal human supervision of AI’s actions – one of my sources said he spent just 20 seconds before authorizing the bombing,” Abraham added.
While his investigation cited six high-ranking Israeli officers familiar with the use of AI, the Israel Defense Forces (IDF) has yet to comment on the accusations. An IDF statement did not deny the use of AI in identifying human targets.
“The system your questions refer to is not a system, but simply a database whose purpose is to cross-reference intelligence sources in order to produce up-to-date layers of information on the military operatives of terrorist organizations. This is not a list of confirmed military operatives eligible to attack,” the IDF said in a statement published in response to questions from the British daily The Guardian.
Reham Owda, an independent political analyst from Gaza, shared her opinion with Sputnik: “This means that Israel chose the easiest and the cheapest way to target Hamas militants in Gaza without considering the lives of innocent civilians, including women and children. This means that Israel deals with Gaza civilians as if they were numbers or just items that can be removed randomly.”
How Many People Have Been Killed by AI in Gaza?
In his interview with Democracy Now, Abraham revealed the grim statistics of Lavender’s operations:
“The military knew that approximately 10 percent of the people whom the machine marked to be killed – one-tenth of them were not Hamas militants,” Abraham said in the interview.
“Some of them had a loose connection to Hamas, some had completely no connection. One source told me how the machine marked people who had the same name and nickname as another person, a Hamas operative… The military had what they called a collateral damage degree. Another source said that up to 20 innocent civilians could be killed in order to liquidate one low-ranking Hamas activist. For high-ranking officers, say, a Hamas brigade commander, the number could be in triple digits – for the first time in Israeli Defense Force’s history.”
“According to Abraham’s reporting, Israeli officials did not want to ‘waste’ more expensive precision-guided munitions on the many junior-level Hamas operatives identified by Lavender program,” The Washington Post wrote. So, “dumb” bombs were used, which the WP describes as “heavy, unguided weapons that inflicted significant loss of civilian life.”
The following question is inevitable: who is going to be held responsible for the use of an Israeli AI machine that, according to the officers interviewed by Abraham, marked 37,000 people in Gaza as targets for annihilation?
“Essentially, it is not the machine, but the human being that imagines, designs, creates and then triggers such a machine into action – that person is responsible for everything that results,” Bennett noted.
Meanwhile, Owda told Sputnik that ultimate responsibility lies with Israel’s special services:
“Military Intelligence Directorate (Aman) and Israel’s Internal Security Service Shabak are the two security directorates that are responsible for the strikes made with the use of AI. Because their members are responsible for the data, loaded into [IDF’s] computer system… And that system included classification of each Palestinian person by his or her age, education, profession, etc.”
The matter is so serious that UN Secretary-General Antonio Guterres, in a special statement, expressed “serious concern” over reports that Israel used AI to identify targets in Gaza, at least in the first weeks of its operation there.
“No part of life and death decisions which impact entire families should be delegated to the cold calculation of algorithms,” Guterres is quoted by international news agencies as saying.