Watch: IDF Colonel Explains Use of AI to “Identify” Targets


Previously: The Guardian Doxes Top Jewish Unit 8200 Spy Chief

It should just be expected that AI is being used to do everything at this point.

The fact that the Jews are claiming they are using AI to target civilians in Gaza should be viewed as a pretense for them to later claim that they are not responsible for various atrocities they commit.

“Oh, sorry – that must have been a mistake with the AI.”

The Guardian:

A video has surfaced of a senior official at Israel’s cyber intelligence agency, Unit 8200, talking last year about the use of machine learning “magic powder” to help identify Hamas targets in Gaza.

The footage raises questions about the accuracy of a recent statement about use of artificial intelligence (AI) by the Israeli Defense Forces (IDF), which said it “does not use an artificial intelligence system that identifies terrorist operatives or tries to predict whether a person is a terrorist”.

However, in the video, the head of data science and AI at Unit 8200 – named only as “Colonel Yoav” – said he would reveal an “example of one of the tools we use” before describing how the intelligence division used machine learning techniques in Israel’s May 2021 offensive in Gaza for “finding new terrorists”.

“Let’s say we have some terrorists that form a group and we know only some of them,” he said. “By practising our data science magic powder we are able to find the rest of them.”

The descriptions in the video of technology used by Unit 8200 bear similarities with recent testimony from six IDF insiders about their use of an AI tool called “Lavender” during its offensive on Hamas. They said that the AI-generated database had been used to assist intelligence officers involved in the bombing campaign in Gaza, helping identify tens of thousands of potential human targets.

See: Israel Uses AI System to Identify Targets, Gives Soldiers Permission to Kill Civilians

In its rebuttal, the IDF said some of the accounts were “baseless”. However, the accounts are consistent with the remarks by Yoav during an AI conference at Tel Aviv University in February last year. The video, in which Yoav can be heard talking but not seen, was hosted on the university’s YouTube channel, and until recently it had fewer than 100 views.

When using AI to predict whether someone is a terrorist, he explained, Unit 8200 takes information it has about people it believes are members of terrorist groups and aims “to find the rest of the group”.

Referring to a specific example, the official said that in the IDF’s May 2021 military operation in Gaza, his department applied this principle to “find Hamas squad missile commanders and anti-tank missile terrorists in Gaza in order to operate against them”.

He explained that using a form of machine learning – known as “positive unlabelled learning” – “we take the original sub-group, we calculate their close circles, we then calculate relevant features, and at last we rank the results and determine the threshold.”

The colonel said intelligence officers’ feedback is used “to enrich and improve our algorithm” and stressed that “people of flesh and blood” make decisions. “Ethically speaking we put a lot of emphasis on this,” he said, adding that “these tools are meant to help break their barriers”.

According to Yoav, Unit 8200 was able to break “the human barrier” during the May 2021 offensive when it managed to produce more than 200 new targets. “There were times when this amount took us almost a year,” he said.

There is no reason they would not be using AI. Of course they are using AI. They don’t have a bunch of people sitting and watching the satellite feeds to try to figure out where people are. AI can very easily track all of that.

However, the Jews are running the AI, and they are telling it to target civilians.

They should not be able to squirm out of these murders by blaming the AI. The AI only did what they told it to do.