Algorithms and AI Have Turned Gaza into a Laboratory of Death
The revelations by +972 Magazine (https://www.972mag.com/lavender-ai-israeli-army-gaza/) and Local Call have exposed the darkest core of the contemporary war in Gaza, in which genocide is carried out not only by bombs and missiles but by data, algorithms, and global digital platforms.
The Israeli artificial intelligence system known as Lavender has confirmed what the Palestinian resistance, Lebanon, and Iran have denounced for years: that technology is an organic part of the Zionist war machine, functioning as an instrument of surveillance, target selection, and mass extermination.
The liberal rhetoric of “digital privacy” collapses in the face of the facts. Applications such as WhatsApp insist on the promise of end-to-end encryption but conceal what is essential: metadata are worth more than messages.
Location, contact networks, patterns of communication, and group affiliations make it possible to map the social life of an entire people. In Gaza, these data have been incorporated into military systems that turn human relationships into algorithmic criteria for death.
Lavender assessed virtually the entire population of the Gaza Strip, comprising more than 2.3 million people, assigning automated “risk scores”. Merely being in a WhatsApp group, maintaining…
