In this world of AI, lots of people still wake up and choose to do genocide
They're not worth your time.
Reading about Meta’s Israel Policy chief the other day reminded me that there are real human racist decisions without which “algorithmic suppression” cannot happen. We are deceived by the idea that our screens are controlled by “algorithms”. Actually, they’re controlled by racists.
Tech executives working with genocidal governments decide which news items we can see on our screens: news editors decide what those items contain. CBC policy, BBC policy, and a revolving door between the Israeli military and American journalism, as Jeremy Scahill mentioned.
That’s why I never saw the point of the supposedly groundbreaking stories about Lavender AI or the Where’s Daddy algorithm for killing people in Gaza. Really? Is an algorithm driving the bulldozer over hundreds of people? Is an algorithm dressing up in dead women’s underwear, playing with dead children’s toys, and taking photos for dating sites?
On today’s EI Livestream, Jon showed some before-and-after aerial imagery of Rafah.
It’s not serious to think that any intelligence, artificial or otherwise, was applied in doing “targeting” that led to this destruction. We are talking about multiple Hiroshima nuclear holocausts’ worth of ordnance dropped on Gaza at this point (I think it’s around 4x now and counting). There’s no intelligent algorithm needed, only the sheer depravity to take the decision, the American bombs to drop, and the American planes to deliver them.
Well, that’s not all that’s needed. Because in north Gaza right now it’s a very intimate, hands-on holocaust. In the Sde Teiman torture/death camp, it’s a hands-on holocaust.
And according to a CNN story and an Israeli media story this past week (the latter written up in the Cradle), doing the genocide is driving the perpetrators crazy - so crazy that they can’t even talk about what they did in direct language, even in the exposé. Instead of talking about the trauma of what they did, they talk about what they saw. Instead of talking about what they’ve become, they talk about being depressed and unmotivated. Their army psychologist tries to help them “normalize” the atrocities, but the subject of the CNN story killed himself. Can human compartmentalization be so total that you can commit mass murder and then retain the sense of the preciousness of life that you need to have relationships? Can racial supremacy really get you that far? It might not.
It’s a word I heard my whole life but I only bothered looking it up a few years ago: shibboleth, meaning a word that distinguishes one group from another. Speak for a few seconds and I’ll know from just a few words whether you’re pro- or anti-genocide.
Anti-genocide shibboleths include using the word “genocide” to describe what Israel is doing; using the word “resistance” to describe the people fighting Israel; using the word “martyr” to describe someone Israel has killed.
Pro-genocide shibboleths include calling the victims “terrorists”, phrases like “Hamas embeds in civilian areas”, “Hamas uses human shields”, and repeating the lies about October 7th.
I wrote last year that, as far as the pro-genocide and anti-genocide sides go, as they reject ours (facts) and we reject theirs (lies), soon “there will be no reason to talk to one another and no desire on either side to do so.” We’re pretty nearly there - not sure about you, but I get less and less from watching anti-genocide people go on debate platforms to have it out with the genocidals (usually with a genocidal or a fake friend of the anti-genocide movement hosting).
It’s good to keep track of these shibboleths. Knowing them can help: on the anti-genocide side we need to learn to find each other quickly and recognize who’s a waste of time. There isn’t any time to waste. Every day that our efforts fail carries an intolerable cost.
I’m still in disbelief that there are people on social media platforms telling others it’s not a genocide and that voting for the red/blue team will make it better. I often feel like an alien dropped in some weird place. That’s why I appreciate your sit-reps with Jon and others - just to feel like I’m not going mad.
"That’s why I never saw the point of the supposedly groundbreaking stories about Lavender AI or the Where’s Daddy algorithm for killing people in Gaza. Really? Is an algorithm driving the bulldozer over hundreds of people?"
I recommend this interview with Shir Hever on The Analysis: https://theanalysis.news/gaza-ai-targeting-a-cover-for-genocide/
Let me share two critical points from the interview on the function of "AI" systems within the genocidal war machine. These points resonate deeply with the problem of getting people to murder other people on a large scale and the psychological and moral toll on the foot soldiers ordered to commit genocide:
(1) Machine learning systems provide target lists much faster and do not run out of targets (which occurred after a few weeks in previous wars on Gaza). Naturally, the military "value" of an AI target designation is much poorer than that of one compiled by a human intelligence officer, whose function the system is designed to simulate. However, this is apparently a feature, not a bug, as it allows the leadership to order what amounts to carpet bombing of highly populated civilian areas without issuing explicit orders to do so (which could get them sued for war crimes) and without directly breaking military protocols and rules of engagement (which could lead to a breakdown of discipline).
This is mostly drawn from a previous story in 972 magazine that has more relevant framing than the one linked here by Justin: https://www.972mag.com/mass-assassination-factory-israel-calculated-bombing-gaza/
(2) Machine learning systems in the Israeli military are not trained to pick "correct" targets; they are instead optimized to get a human operator to sign off on operations. Note that the machine learning systems are designed by ex-military intelligence officers (the military-to-startup pipeline) who are well positioned to design systems that fit in with, and effectively subvert, military intelligence practices. [IDF procedures were already worse than the US rules of engagement, which failed to prevent massacres in occupied Iraq...]
[Machine learning systems need training data on which to optimize a reward function. Large Language Models are shown the beginning of a known text and rewarded according to how well they predict the next word in the sequence. A system trained this way can generate new texts from prompts, essentially as a statistical average of previous texts.]
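To make that bracketed point concrete, here is a minimal, purely illustrative sketch of the next-word-prediction objective - a toy PyTorch example of my own, not anything from the interview or from reporting on the Israeli systems; the model, the training text, and all parameters are hypothetical.

```python
# Toy illustration (mine, not from the interview) of the next-token objective:
# the model is "rewarded" (its loss shrinks) only for predicting whatever
# token actually comes next in the training text.
import torch
import torch.nn as nn
import torch.nn.functional as F

text = "the quick brown fox jumps over the lazy dog "
vocab = sorted(set(text))
stoi = {ch: i for i, ch in enumerate(vocab)}        # character -> integer id
ids = torch.tensor([stoi[ch] for ch in text])

class TinyLM(nn.Module):
    """A deliberately crude 'language model': predicts the next character
    from the current one alone. Real LLMs condition on long contexts,
    but the training objective is the same."""
    def __init__(self, vocab_size, dim=16):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)
        self.head = nn.Linear(dim, vocab_size)

    def forward(self, tokens):
        return self.head(self.embed(tokens))        # logits over the next token

model = TinyLM(len(vocab))
opt = torch.optim.Adam(model.parameters(), lr=1e-2)

for step in range(200):
    logits = model(ids[:-1])                        # predict token t+1 from token t
    loss = F.cross_entropy(logits, ids[1:])         # penalize wrong next-token guesses
    opt.zero_grad()
    loss.backward()
    opt.step()
```

The only thing the sketch is meant to show is that such a system optimizes whatever signal you define as "success." Keep the same machinery but replace "predict the next word of a known text" with "produce a designation a human operator will wave through," and you get the perverse objective described in point (2).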
Compiling training data based on the degree of military "success" of a given strike would involve follow-up reporting that Greg Stoker says the IDF doesn't do at all (cf. his recent interview with Justin on the AEP). Instead, the IDF is already sitting on a pile of previous target designations that were overruled by human operators and a pile of target designations that were signed off on.
According to Shir Hever, they simply declared that getting the green light for an operation is itself the goal of the system and trained it to craft target designations (based on text clippings taken from intelligence reports) in whichever way would most likely lead to a human operator waving it through. Since it is hard work to diligently read through the thousands of target designations produced daily by the gibberish machine, human operators tend to decide by title and general framing. For example, Israeli intelligence officers may get second thoughts and decide to actually read the target designation report (and notice that it is AI gibberish) if the target is a Palestinian woman; so computer-generated target designations come to include language such as "target is a known male Hamas militant" wherever possible.
Some of Shir's points necessarily involve conjecture and I have not seen them elsewhere (and he is an economic rather than military analyst). From my understanding of machine learning systems his descriptions sound all too plausible, however. Clearly this dystopian implementation of "AI" greatly exacerbates the grotesque amount of suffering created directly by the bombings; it also is unlikely to "solve" the problem of high suicide rates of the key human cogs in the genocide machine...