Israel is using AI systems “Lavender” and “Where’s Daddy?” to kill Palestinians in Gaza, an investigative report by +972 Magazine has revealed pic.twitter.com/9HXHScKeRN
— TRT World (@trtworld) April 5, 2024
Well, this is embarrassing.
Unless it was done on purpose for some reason, which is possible.
The identity of the commander of Israel’s Unit 8200 is a closely guarded secret. He occupies one of the most sensitive roles in the military, leading one of the world’s most powerful surveillance agencies, comparable to the US National Security Agency.
Yet after spending more than two decades operating in the shadows, the Guardian can reveal how the controversial spy chief – whose name is Yossi Sariel – has left his identity exposed online.
The embarrassing security lapse is linked to a book he published on Amazon, which left a digital trail to a private Google account created in his name, along with his unique ID and links to the account’s maps and calendar profiles.
The Guardian has confirmed with multiple sources that Sariel is the secret author of The Human Machine Team, a book in which he offers a radical vision for how artificial intelligence can transform the relationship between military personnel and machines.
Published in 2021 using a pen name composed of his initials, Brigadier General YS, it provides a blueprint for the advanced AI-powered systems that the Israel Defense Forces (IDF) have been pioneering during the six-month war in Gaza.
An electronic version of the book included an anonymous email address that can easily be traced to Sariel’s name and Google account. Contacted by the Guardian, an IDF spokesperson said the email address was not Sariel’s personal one, but “dedicated specifically for issues to do with the book itself”.
The security blunder is likely to place further pressure on Sariel, who is said to “live and breathe” intelligence but whose tenure running the IDF’s elite cyber intelligence division has become mired in controversy.
Unit 8200, once revered within Israel and beyond for intelligence capabilities that rivalled those of the UK’s GCHQ, is thought to have built a vast surveillance apparatus to closely monitor the Palestinian territories.
…
One section of the book heralds the concept of an AI-powered “targets machine”, descriptions of which closely resemble the target recommendation systems the IDF is now known to have been relying upon in its bombardment of Gaza.
See: Israel Uses AI System to Identify Targets, Gives Soldiers Permission to Kill Civilians
Over the last six months, the IDF has deployed multiple AI-powered decision support systems that have been rapidly developed and refined by Unit 8200 under Sariel’s leadership.
They include the Gospel and Lavender, two target recommendation systems that have been revealed in reports by the Israeli-Palestinian publication +972 magazine, its Hebrew-language outlet Local Call and the Guardian.
See: AI Called “The Gospel” Tells Israel What Part of Gaza to Bomb, The Guardian Reveals
The IDF says its AI systems are intended to assist human intelligence officers, who are required to verify that military suspects are legitimate targets under international law. A spokesperson said the military used “various types of tools and methods”, adding: “Evidently, there are tools that exist in order to benefit intelligence researchers that are based on artificial intelligence.”
…
In one chapter of the book, he provides a template for how to construct an effective targets machine drawing on “big data” that a human brain could not process. “The machine needs enough data regarding the battlefield, the population, visual information, cellular data, social media connections, pictures, cellphone contacts,” he writes. “The more data and the more varied it is, the better.”
It appears that a big purpose of this slaughter in Gaza is to train this system further.
They will use it on other populations in the future.
They will sell it to the American Jews to use against us.
Now that this guy is burned, they will probably just put an AI in charge of the spy group.
Israeli journalist Yuval Abraham reveals how Israel uses AI technology “Lavender” despite a high error rate that results in massive civilian casualties.
Abraham said that according to sources who have used the system, it marked 37,000 Palestinians in Gaza as “low-level”… pic.twitter.com/Q2n95TtUt6
— Suppressed Nws. (@SuppressedNws) April 5, 2024