In one case discussed by the sources, the Israeli military command knowingly approved the killing of hundreds of Palestinian civilians in an attempt to assassinate a single top Hamas military commander. “The numbers increased from dozens of civilian deaths [permitted] as collateral damage as part of an attack on a senior official in previous operations, to hundreds of civilian deaths as collateral damage,” said one source.
“Nothing happens by accident,” said another source. “When a 3-year-old girl is killed in a home in Gaza, it’s because someone in the army decided it wasn’t a big deal for her to be killed — that it was a price worth paying in order to hit [another] target. We are not Hamas. These are not random rockets. Everything is intentional. We know exactly how much collateral damage there is in every home.”
According to the investigation, another reason for the large number of targets, and the extensive harm to civilian life in Gaza, is the widespread use of a system called “Habsora” (“The Gospel”), which is largely built on artificial intelligence and can “generate” targets almost automatically at a rate that far exceeds what was previously possible. This AI system, as described by a former intelligence officer, essentially facilitates a “mass assassination factory.”
AI only gives you wrong answers convincingly, and it should not be a substitute for analysis or intelligence.
That is fucked up beyond all belief.
It’s like the “AI” that health insurance corp used to reject applicants. They very likely trained it to reject as many applicants as possible.
The answer may lie in a statement from the IDF Spokesperson on Nov. 2, according to which it is using the AI system Habsora (“The Gospel”), which the spokesperson says “enables the use of automatic tools to produce targets at a fast pace, and works by improving accurate and high-quality intelligence material according to [operational] needs.”
Similarly, the people who made this “AI” likely don’t see Palestinians as humans and are perfectly willing to kill them.