Israel’s military has been using artificial intelligence to help choose its bombing targets in Gaza, sacrificing accuracy in favor of speed and killing thousands of civilians in the process, according to an investigation by Israel-based publications +972 Magazine and Local Call.
The system, called Lavender, was developed in the aftermath of Hamas’ October 7th attacks, the report claims. At its peak, Lavender marked 37,000 Palestinians in Gaza as suspected “Hamas militants” and authorized their assassinations.
Israel’s military denied the existence of such a kill list in a statement to +972 and Local Call. A spokesperson told CNN that AI was not being used to identify suspected terrorists but did not dispute the existence of the Lavender…