
Lavender (2024)

Camera: Zenza Bronica SQ-Ai

Lens: Zenzanon PS 50mm

Film: Ilford Delta 100

Chemistry: Rodinal (1:25 / 9 min. @ 20°C)

 

+972 Magazine: ‘Lavender’: The AI machine directing Israel’s bombing spree in Gaza (Publ. 3 April 2024)

 

See also: Campaign to Stop Killer Robots

 

Richard Medhurst commentary: Israel's Killer Robot (Publ. 7 April 2024):

 

The Israelis are using a killer robot, AI, artificial intelligence, called Lavender. The code name is Lavender. And I want to read to you this exclusive from 972 magazine that contains interviews with Israelis who've used the system to target people in Gaza.

 

The targeting is so broad and abstract and sweeping that you see before you the mountain of corpses in Gaza. It's something out of a movie almost, but unfortunately it's reality for many. Let me read to you from this article.

 

So they start with a book called «The Human-Machine Team». Okay, and why are they talking about this book?

 

They're talking about it because it was written by the commander of an Israeli unit - Unit 8200 - which just so happens to be the one specialized in matters of cyber warfare. The book was written in 2021, and it's a proposal for an AI system.

 

[* Note: This book was written by Lieutenant Colonel Yossi Sariel (b. 1978 in Haifa). As a 19-year-old in 1997 he was enlisted in the IDF's intelligence division. Sariel has a bachelor's degree in psychology and sociology, a bachelor's degree in Middle Eastern Studies from Bar-Ilan University and a Master's degree in Strategy, Security and Economics from the National Defense University in the USA.

In 2018 Sariel was awarded the 'Eliyahu Golomb Israel Security Award' for an 'Artificial Intelligence project' and an 'Anti-Terrorism project'. (Eliyahu Golomb (1893-1945) was a Zionist terrorist who was one of the founders of the Haganah and the Palmach).

Yossi Sariel was appointed commander of Unit 8200 in February 2021. In May 2021 he published 'The Human-Machine Team' under the pseudonym Brigadier General YS. His identity as the author of the book and the commander of Unit 8200 was revealed by the British newspaper The Guardian on the 5th of April 2024, because through his book profile on Amazon he left "digital traces leading to a private Google account created in his name, along with a unique ID and links to the account's maps and calendar profiles". Later it was also revealed that earlier versions of the book mentioned Sariel's full name. The IDF spokesperson responded to the publication [of Sariel's identity] by saying that "the disclosure of the officer's details is a mistake".

Source - Hebrew Wikipedia: Yossi Shariel (b. 1978) ]

 

Well, it turns out it wasn't just a proposal, because such a machine actually exists, and it's called Lavender. Okay, now let me read to you what they've written here.

 

So the human personnel often serve only as a rubber stamp for the machine's decisions. They would personally devote about 20 seconds to each target before authorizing a bombing, just to make sure the Lavender-marked target is male, okay.

 

So just to reiterate, the computer, which they have fed ridiculous parameters which we'll get to in a second, just finds targets, and then the human who is at the computer just says, «Yeah, okay, whatever» - they just devote 20 seconds before they destroy - not destroy, rather take someone's life, and I should say LIVES, because you'll see now the ‘collateral damage’:

 

So this is done despite knowing that the system makes what are regarded as ‘errors’ in approximately 10% of cases and is known to occasionally mark individuals who have merely a loose connection to militant groups or no connection at all.

 

So, just to be clear, on average, 10% of the people that are being chosen by this Lavender AI have nothing to do with Hamas. And then, of course, we'll look at the other 90%. Now in a second you'll see they also have nothing to do with Hamas.

 

But just to be clear, 10% - I mean, to play with human lives like that is so evil, it really is. You know, we're not talking about a 10% breakage rate on cassettes or some other product. These are human lives that they're taking.

 

Moreover, the Israeli Army systematically attacks the targets while they're in their homes, usually at night, while their whole families are present, rather than during the course of military activity.

 

As a matter of policy, the Israelis are not only allowing this machine to choose random targets, but they're bombing them in their homes, with the families. As a matter of policy.

 

Why?

 

Because, according to the sources, from what they regarded as an intelligence standpoint - it was easier to locate the individuals in their private houses. Wow, you don't say. It was just easier to find them at home. So we killed their families, you know. And then they call themselves ‘The most Moral Army’ in the World, like they’re a bunch of goody two shoes fairies - I mean, Jesus.

 

They have something else called ‘Where's Daddy’. Again, just look at how sick and evil and cynical this is. They have a program inside this artificial intelligence called ‘Where's Daddy’. What is the function of ‘Where's Daddy’? It gives the Israelis a notification when a potential target has reached their home.

 

You know, they've gotten home, so the signal from their mobile device is pinged on the towers and then the Artificial Intelligence takes the information, collates it and gives it to the officers and says: «Oh, look, this guy» - ‘Daddy’, right - supposedly the head of the family - has just arrived home - and THAT is the cue to go and bomb the family.

 

The result, as the sources testified, is that thousands of Palestinians, most of them women and children, or people who were not involved in the fighting, were wiped out by Israeli air strikes, ESPECIALLY during the first weeks of the war, because of the AI program’s decisions.

 

One of them says: «We were not interested in killing Hamas operatives only when they were in a military building or engaged in a military activity.» That's one intelligence officer saying that.

 

«On the contrary, the IDF bombed them in homes without hesitation, as a FIRST option. It's much easier to bomb a family's home. The system is built to look for them in these situations.»

 

Need I say any more? Don't worry, there's more.

 

Now they discuss ‘Junior Militants’.

 

So this is basically about how they are even killing children; 16 - well, I mean, 16 is a teenager, right, but basically minors - and they're killing them and naming them ‘Junior Operatives’, as if that kind of automatically makes them valid for murder, and then they talk about how they don't want to WASTE a precision-guided bomb on a ‘Junior Operative’.

 

So this 17-year-old is not worth a proper precision-guided bomb, so let's just use a DUMB bomb, which causes a larger blast radius and more destruction and basically kills not just this person and their family, but OTHER families and their neighbours, too.

 

So, out of LAZINESS, out of not wanting to use the better bombs, they just use the DUMB bombs and cause more damage and kill more people. And the targets are children or teenagers. I mean, look at this - also, how callous they are, right.

 

They say that «during the first weeks of the war, for every ‘Junior Hamas’ operative that Lavender marked, it was permissible to kill up to 15 - 20 civilians». Permissible, you know? It's like you're sewing, and a bunch of pins fell on the floor.

 

«Whatever, it's 15 or 20 pins, who cares? You know, it's not a big deal. I spilled 15 drinks on the floor, whatever.» These are LIVES, man, these are people, these are human beings. This is acceptable for them, like they're playing God.

 

The sources added that in the event that the target was a ‘Senior Hamas’ official - whatever that means, because according to the Israelis, a doctor in a hospital is a ‘Hamas official’ with the rank of Battalion or Brigade Commander - the Israeli Army on several occasions authorized the killing of more than 100 civilians in the assassination of a single Commander.

 

So the computer is telling them: «This guy is here, we can bomb him now, but you're going to kill 100 people. Do you want to proceed?»

 

And they say «Yes».

 

You can't blame the computer, because the Israelis fed these parameters in there. If someone else had programmed it - someone who was actually in possession of a Soul, of Morality - they would have provided different parameters.

 

But the combination here is LETHAL. You're mixing two lethal things, right, and making the effect of the Israeli War Machine all the more exponential. That's what's going on.

 

You're industrializing and automating what is already a corrupt Army, an occupying Army, an evil Army. I want to be very clear before I give you the other details. I was going to save this for the end, but I might as well say it now.

 

The computer doesn't matter. The Israelis have always, ALWAYS behaved like this. You have to understand this. The Israelis have ALWAYS, ALWAYS, ALWAYS caused disproportionate damage. This is their Doctrine. They believe in this. It's the bread and butter of the Israeli military; a bunch of terrorist groups, literally.

 

The computer is not the issue here, it's the Israelis.

 

Let me continue to the other part. At one point, the Lavender AI marked some 37,000 Palestinians as ‘suspected’ Hamas militants, MOST of them ‘Junior’. Okay, so you do the math.

 

If the Israelis say that for every ‘Junior Commander’ you can kill 15 to 20 people as collateral, do the math. MOST of them - what is ‘most of them’? Let's be very conservative and say 51% instead of 90 or 80 or 75. Let's say 51% of 37,000 are ‘Junior Commanders’. Right, that's MOST of them, it's the majority. Multiply that by 15 to 20 and you tell me what the result is. [See the note below for the arithmetic.]
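[* Note: here is the back-of-the-envelope arithmetic the speaker asks the listener to do, written out as a small Python sketch. The 37,000 marked individuals and the 15-20 permitted civilian deaths per ‘junior’ target are the figures quoted from the +972 article; the 51% share of ‘junior’ targets is the speaker's own deliberately conservative assumption, not a reported number.

    # Rough calculation under the speaker's stated assumptions
    marked_individuals = 37_000                # total marked by Lavender, per the article
    junior_share = 0.51                        # speaker's deliberately low estimate of "most"
    collateral_low, collateral_high = 15, 20   # permitted civilian deaths per junior target

    junior_targets = int(marked_individuals * junior_share)   # 18,870
    print(junior_targets * collateral_low)                     # 283,050
    print(junior_targets * collateral_high)                    # 377,400

By that conservative reading, the permitted ‘collateral’ comes to roughly 283,000-377,000 people. ]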

 

So why did they develop this in the first place? «They wanted to allow us-» - meaning, the people running the machines - «-to attack [the junior operatives] automatically. It's the Holy Grail. Once you go automatic, target generation goes crazy.» And so at one point, look at what they do.

 

They KNOW that the 90% is just a random figure, it's a rubber stamp, because when you look at the criteria used to tag these people as ‘Junior Operatives’ or ‘Hamas Operatives’, it falls apart.

 

And let me show you something here: Look at the criteria.

 

So, for example: «Similar problems exist with the ability of target machines to assess the phone used by an individual marked for assassination. ’In war, Palestinians change phones all the time. People lose contact with their families, they give their phone to a friend or a wife, maybe they lose it. There is no way to rely 100% on the automatic mechanism that determines which [phone] number belongs to whom.’»

 

The sources that are talking to the magazine - they say that «the Army KNEW the minimal human supervision in place would not discover these faults. There was no ‘zero-error’ policy. Mistakes were treated statistically.»

 

«Because of the scope and magnitude, the protocol was that even if you don't know for sure that the machine is right, you know that statistically it's fine, so you go for it.»

 

So they can sleep at night, so they can feel better about themselves, they say, «Well, you know, 90% of them are correctly marked. So you know, let's just go ahead, even if we're not sure who this person is. Let's just do it».

 

One of these soldiers says: «There's something about the statistical approach that sets you to a certain norm and standard. There has been an illogical amount of [bombings] in this operation. This is unparalleled, in my memory. And I have much more trust in a statistical mechanism than a soldier who lost a friend two days ago. Everyone there, including me, lost people on October 7th. The machine did it coldly. And that made it easier.»

 

So this guy's getting everything mixed up because he's saying, «Well, the machine doesn't make mistakes.» But then he admits that it makes mistakes, but because it's a machine, it makes him sleep - you know; feel better and sleep at night.

 

It really says a lot, doesn't it? It really tells you everything.

 

There are more things I highlighted for you.

 

«In practice, sources said this meant that for civilian men marked in error by Lavender, there was no supervising mechanism in place to detect the mistake.»

 

So when there ARE mistakes, there's no one in place to catch them.

 

«A common error occurred ‘if the [Hamas] target gave [his phone] to his son, his older brother, or just a random man. That person will be bombed in his house with his family. This happened often. These were most of the mistakes caused by Lavender.’»

 

So, a phone number gets you killed. That's it. Forgive me for being crass, but you know, you're a guy, you have a penis, you're a male and you have the wrong phone number. That's a death sentence in Gaza. This is what the Israelis are telling us.

 

This is madness, this is evil. This is crazy.

 

And there are other things in here - for example, the fact that most of the people being killed are women and children. So this doesn't make any sense anymore. They tell themselves: «Well, the target is male, so let's bomb them.» But then what is the end result? The bottom line: women and children are being killed. So the Israelis are basically admitting that, «Yeah, we're fine with this. Whether WE do it or the MACHINE does it, it doesn't matter, really - does it.»

 

This is a very, very long article and I sat down the whole day going through it, adding notes, and I've just given you the gist of it.

 

There's no need for me to go any further, but I'll just add a few last points.

 

The Israelis have ALWAYS behaved like this. This is their policy. They were BRUTAL in ’48, they were BRUTAL even before that, they were BRUTAL in ’56 and then also in ’67 and in ’73. They've ALWAYS been brutal, 24/7. They've basically made this AI just as BRUTAL as them and AUTOMATED the process. That's it.

 

One other point that I would add is that this could be a psy-op.

What do I mean by a psy-op?

 

‘Psy-op’ is an abbreviation for ‘psychological operation’. It's meant to demoralize the other side, right? So this could be the Israelis feeding the journalists a bunch of rubbish about Lavender - I don't think so, but let's consider it - so that Hamas feels demoralized, like, «Oh crap, they're going to kill us with our families, they're going to target us; this machine, this robot killer.»

 

It COULD be, but I'll also add another point.

 

The Israelis always love to pretend that they're Hot Shit, you know, in the tech and electronic domains. What do I mean by this?

 

For example, when they killed a nuclear scientist in Iran [Mohsen Fakhrizadeh (1958-2020)], they then told the New York Times how they smuggled a gun into Iran and then it was an automated gun that killed him. I mean, who the hell knows if that's true? We don't know if that's true.

 

Also, this idea that they can hack everyone with Pegasus - and yeah, Pegasus is real, no question about that. But the Israelis have cultivated an image of themselves which serves them. It demoralizes people; they can sell cyber weapons, etc.

 

So you shouldn't buy into it too much. And where's the proof that they're not as Hot Shit as they say they are? October 7th. They had no clue what hit them. They had no clue what hit them; they never saw it coming.

 

All this tech and equipment down the toilet, useless, pathetic. So just keep that in mind, I'll leave it at that. I have no reason to doubt the veracity of this article. I do think it's true. I have no doubt that the Israelis have been developing this for quite a while and I don't think the first time they used it was in Gaza.

 

And you know, I'm not saying AI is bad per se. It's bad when you have bad people feeding it bad parameters. Because if you told the machine: «Hey, I want ZERO collateral damage because I'm not a murdering psychopath» - the machine would simply give you no targets that involve collateral damage.

 

Also, who are the Israelis to kill anyone in Gaza? Who are they? Just because someone's in Hamas? That gives them the right to murder them?

 

Okay, let's take Israel's logic and flip it around. So does that mean Hamas can develop an AI that targets any person in the Israeli Army in their homes? Cuz they didn't seem too Happy on October 7th.

 

And the irony is that you had people saying, «Oh well, October 7th, it was a Peace festival. How could you target these people who are, you know, on vacation, who are having a party outside the concentration camp that you've been put in?» Jesus Christ.

 

I would honestly not talk about the festival because it makes them look like a bunch of disconnected, deranged Psychopaths.

 

But anyway, you know, the idea was that, well, even if you did kill this person who is in the Israeli military, he was PARTYING, okay, and this Hamas person was in their home with their family; that's even worse, that's a MILLION times worse. And it's always been like this; before and after October 7th.

 

Remember, the IRA did not CARE whether a British soldier was in his military fatigues, in the barracks, or in a pub. He's occupying their country, he's got to go. You can't just steal land with zero consequences, right?

 

This is so arrogant.

 

But anyway, you see the hypocrisy here, it's okay for Israelis to hide behind this; this idea that they're civilians and «Oh, they're just partying», when actually a lot of them are in the military. COMPULSORY. MANDATORY. MEN AND WOMEN. Both of them.

 

Every man and woman in Israel has to go to the military, whether they like it or not. So you can't be in the military and then call yourself a civilian. Which one are you? Are you a civilian or in the military? Fucking pick one.

 

Anyway, we're going off on a tangent here. I just wanted to point that hypocrisy out.

 

This is the equivalent of a Terminator, make no mistake.

Uploaded on April 8, 2024
Taken in March 2024