Subject ‘Lavender’: The AI Machine Directing Israel’s Bombing Spree in Gaza
Date April 6, 2024 1:10 AM

‘LAVENDER’: THE AI MACHINE DIRECTING ISRAEL’S BOMBING SPREE IN
GAZA  


 

Yuval Abraham
April 3, 2024
+972 Magazine


_ The Israeli army has marked tens of thousands of Gazans as suspects
for assassination, using an AI targeting system with little human
oversight and a permissive policy for casualties, +972 and Local Call
reveal. _

Massive destruction is seen in Al-Rimal popular district of Gaza City
after it was targeted by airstrikes carried out by Israeli forces,
October 10, 2023. (Mohammed Zaanoun/Activestills)

 

In 2021, a book titled
“The Human-Machine Team: How to Create Synergy Between Human and
Artificial Intelligence That Will Revolutionize Our World” was
released in English under the pen name “Brigadier General Y.S.” In
it, the author — a man who we confirmed to be the current commander
of the elite Israeli intelligence unit 8200 — makes the case for
designing a special machine that could rapidly process massive amounts
of data to generate thousands of potential “targets” for military
strikes in the heat of a war. Such technology, he writes, would
resolve what he described as a “human bottleneck for both locating
the new targets and decision-making to approve the targets.”

Such a machine, it turns out, actually exists. A new investigation by
+972 Magazine and Local Call reveals that the Israeli army has
developed an artificial intelligence-based program known as
“Lavender,” unveiled here for the first time. According to six
Israeli intelligence officers, who have all served in the army during
the current war on the Gaza Strip and had first-hand involvement with
the use of AI to generate targets for assassination, Lavender has
played a central role in the unprecedented bombing of Palestinians,
especially during the early stages of the war. In fact, according to
the sources, its influence on the military’s operations was such
that they essentially treated the outputs of the AI machine “as if
it were a human decision.”

Formally, the Lavender system is designed to mark all suspected
operatives in the military wings of Hamas and Palestinian Islamic
Jihad (PIJ), including low-ranking ones, as potential bombing targets.
The sources told +972 and Local Call that, during the first weeks of
the war, the army almost completely relied on Lavender, which clocked
as many as 37,000 Palestinians as suspected militants — and their
homes — for possible air strikes.

During the early stages of the war, the army gave sweeping approval
for officers to adopt Lavender’s kill lists, with no requirement to
thoroughly check why the machine made those choices or to examine the
raw intelligence data on which they were based. One source stated that
human personnel often served only as a “rubber stamp” for the
machine’s decisions, adding that, normally, they would personally
devote only about “20 seconds” to each target before authorizing a
bombing — just to make sure the Lavender-marked target is male. This
was despite knowing that the system makes what are regarded as
“errors” in approximately 10 percent of cases, and is known to
occasionally mark individuals who have merely a loose connection to
militant groups, or no connection at all.

Moreover, the Israeli army systematically attacked the targeted
individuals while they were in their homes — usually at night while
their whole families were present — rather than during the course of
military activity. According to the sources, this was because, from
what they regarded as an intelligence standpoint, it was easier to
locate the individuals in their private houses. Additional automated
systems, including one called “Where’s Daddy?” also revealed
here for the first time, were used specifically to track the targeted
individuals and carry out bombings when they had entered their
family’s residences.

Palestinians transport the wounded and try to put out a fire after an
Israeli airstrike on a house in the Shaboura refugee camp in the city
of Rafah, southern Gaza Strip, November 17, 2023. (Abed Rahim
Khatib/Flash90)

The result, as the sources testified, is that thousands of
Palestinians — most of them women and children or people who were
not involved in the fighting — were wiped out by Israeli airstrikes,
especially during the first weeks of the war, because of the AI
program’s decisions.

“We were not interested in killing [Hamas] operatives only when they
were in a military building or engaged in a military activity,” A.,
an intelligence officer, told +972 and Local Call. “On the contrary,
the IDF bombed them in homes without hesitation, as a first option.
It’s much easier to bomb a family’s home. The system is built to
look for them in these situations.”

The Lavender machine joins another AI system, “The Gospel,” about
which information was revealed in a previous investigation by +972
and Local Call in November 2023, as well as in the Israeli
military’s own publications. A
fundamental difference between the two systems is in the definition of
the target: whereas The Gospel marks buildings and structures that the
army claims militants operate from, Lavender marks people — and puts
them on a kill list. 

In addition, according to the sources, when it came to targeting
alleged junior militants marked by Lavender, the army preferred to
use only unguided munitions, commonly known as “dumb” bombs (in
contrast to “smart” precision bombs), which can destroy entire
buildings on top of their occupants and cause significant casualties.
“You don’t want to waste expensive bombs on unimportant people —
it’s very expensive for the country and there’s a shortage [of
those bombs],” said C., one of the intelligence officers. Another
source said that they had personally authorized the bombing of
“hundreds” of private homes of alleged junior operatives marked by
Lavender, with many of these attacks killing civilians and entire
families as “collateral damage.”

In an unprecedented move, according to two of the sources, the army
also decided during the first weeks of the war that, for every junior
Hamas operative that Lavender marked, it was permissible to kill up to
15 or 20 civilians; in the past, the military did not authorize any
“collateral damage” during assassinations of low-ranking
militants. The sources added that, in the event that the target was a
senior Hamas official with the rank of battalion or brigade commander,
the army on several occasions authorized the killing of more than 100
civilians in the assassination of a single commander.

Palestinians wait to receive the bodies of their relatives who were
killed in an Israeli airstrike, at Al-Najjar Hospital in Rafah,
southern Gaza Strip, October 24, 2023. (Abed Rahim Khatib/Flash90)

The following investigation is organized according to the six
chronological stages of the Israeli army’s highly automated target
production in the early weeks of the Gaza war. First, we explain the
Lavender machine itself, which marked tens of thousands of
Palestinians using AI. Second, we reveal the “Where’s Daddy?”
system, which tracked these targets and signaled to the army when they
entered their family homes. Third, we describe how “dumb” bombs
were chosen to strike these homes. 

Fourth, we explain how the army loosened the permitted number of
civilians who could be killed during the bombing of a target. Fifth,
we note how automated software inaccurately calculated the number of
non-combatants in each household. And sixth, we show how on several
occasions, when a home was struck, usually at night, the individual
target was sometimes not inside at all, because military officers did
not verify the information in real time.

STEP 1: GENERATING TARGETS

‘Once you go automatic, target generation goes crazy’

In the Israeli army, the term “human target” referred in the past
to a senior military operative who, according to the rules of the
military’s International Law Department, can be killed in their
private home even if there are civilians around. Intelligence sources
told +972 and Local Call that during Israel’s previous wars, since
this was an “especially brutal” way to kill someone — often by
killing an entire family alongside the target — such human targets
were marked very carefully and only senior military commanders were
bombed in their homes, to maintain the principle of proportionality
under international law.

But after October 7 — when Hamas-led militants launched a deadly
assault on southern Israeli communities, killing around 1,200 people
and abducting 240 — the army, the sources said, took a dramatically
different approach. Under “Operation Iron Swords,” the army
decided to designate all operatives of Hamas’ military wing as human
targets, regardless of their rank or military importance. And that
changed everything.

The new policy also posed a technical problem for Israeli
intelligence. In previous wars, in order to authorize the
assassination of a single human target, an officer had to go through a
complex and lengthy “incrimination” process: cross-check evidence
that the person was indeed a senior member of Hamas’ military wing,
find out where he lived, his contact information, and finally know
when he was home in real time. When the list of targets numbered only
a few dozen senior operatives, intelligence personnel could
individually handle the work involved in incriminating and locating
them.

Palestinians try to rescue survivors and pull bodies from the rubble
after Israeli airstrikes hit buildings near Al-Aqsa Martyrs Hospital
in Deir al-Balah, central Gaza, October 22, 2023. (Mohammed
Zaanoun/Activestills)

However, once the list was expanded to include tens of thousands of
lower-ranking operatives, the Israeli army figured it had to rely on
automated software and artificial intelligence. The result, the
sources testify, was that the role of human personnel in incriminating
Palestinians as military operatives was pushed aside, and AI did most
of the work instead. According to four of the sources who spoke to
+972 and Local Call, Lavender — which was developed to create human
targets in the current war — has marked some 37,000 Palestinians as
suspected “Hamas militants,” most of them junior, for
assassination (the IDF Spokesperson denied the existence of such a
kill list in a statement to +972 and Local Call).

“We didn’t know who the junior operatives were, because Israel
didn’t track them routinely [before the war],” explained senior
officer B. to +972 and Local Call, illuminating the reason behind the
development of this particular target machine for the current war.
“They wanted to allow us to attack [the junior operatives]
automatically. That’s the Holy Grail. Once you go automatic, target
generation goes crazy.”

The sources said that the approval to automatically adopt Lavender’s
kill lists, which had previously been used only as an auxiliary tool,
was granted about two weeks into the war, after intelligence personnel
“manually” checked the accuracy of a random sample of several
hundred targets selected by the AI system. When that sample found that
Lavender’s results had reached 90 percent accuracy in identifying an
individual’s affiliation with Hamas, the army authorized the
sweeping use of the system. From that moment, sources said that if
Lavender decided an individual was a militant in Hamas, they were
essentially asked to treat that as an order, with no requirement to
independently check why the machine made that choice or to examine the
raw intelligence data on which it is based.

“At 5 a.m., [the air force] would come and bomb all the houses
that we had marked,” B. said. “We took out thousands of people. We
didn’t go through them one by one — we put everything into
automated systems, and as soon as one of [the marked individuals] was
at home, he immediately became a target. We bombed him and his
house.”

“It was very surprising for me that we were asked to bomb a house to
kill a ground soldier, whose importance in the fighting was so low,”
said one source about the use of AI to mark alleged low-ranking
militants. “I nicknamed those targets ‘garbage targets.’ Still,
I found them more ethical than the targets that we bombed just for
‘deterrence’ — high-rises that are evacuated and toppled just to
cause destruction.”

The deadly results of this loosening of restrictions in the early
stage of the war were staggering. According to data from the
Palestinian Health Ministry in Gaza, on which the Israeli army
has relied almost exclusively since the beginning of the war, Israel
killed some 15,000 Palestinians — almost half of the death toll so
far — in the first six weeks of the war, up until a week-long
ceasefire was agreed on Nov. 24.

Massive destruction is seen in Al-Rimal popular district of Gaza City
after it was targeted by airstrikes carried out by Israeli forces,
October 10, 2023. (Mohammed Zaanoun/Activestills)

‘The more information and variety, the better’

The Lavender software analyzes information collected on most of the
2.3 million residents of the Gaza Strip through a system of mass
surveillance, then assesses and ranks the likelihood that each
particular person is active in the military wing of Hamas or PIJ.
According to sources, the machine gives almost every single person in
Gaza a rating from 1 to 100, expressing how likely it is that they are
a militant. 

Lavender learns to identify characteristics of known Hamas and PIJ
operatives, whose information was fed to the machine as training data,
and then to locate these same characteristics — also called
“features” — among the general population, the sources
explained. An individual found to have several different incriminating
features will reach a high rating, and thus automatically becomes a
potential target for assassination. 
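
To make that description concrete, the following is a minimal, purely
illustrative sketch of how a feature-based rating with a cutoff
threshold works in general. It is not based on Lavender's code or
data; every feature name, weight, and threshold below is
hypothetical, chosen only to mirror the logic the sources describe:
more matched "features" push the score up, and the threshold decides
who gets marked.

    # Illustrative sketch only. Feature names, weights, and the threshold
    # are hypothetical; they mirror the general logic described by the
    # sources, not any real system.
    FEATURE_WEIGHTS = {
        "group_chat_with_known_operative": 35,   # hypothetical weight
        "frequent_phone_changes": 25,            # hypothetical weight
        "frequent_address_changes": 15,          # hypothetical weight
        "communication_pattern_similarity": 25,  # hypothetical weight
    }

    def rate_individual(observed_features):
        """Return a 1-100 score: each matched feature adds its weight."""
        score = sum(weight for feature, weight in FEATURE_WEIGHTS.items()
                    if feature in observed_features)
        return max(1, min(100, score))

    def mark_targets(population, threshold):
        """Mark everyone at or above the threshold; lowering the threshold
        marks more people, the dynamic the sources describe later on."""
        return [name for name, features in population.items()
                if rate_individual(features) >= threshold]

    people = {"person_a": {"group_chat_with_known_operative"},
              "person_b": {"frequent_address_changes"}}
    print(mark_targets(people, threshold=30))  # only person_a is marked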

In “The Human-Machine Team,” the book referenced at the beginning
of this article, the current commander of Unit 8200 advocates for such
a system without referencing Lavender by name. (The commander himself
also isn’t named, but five sources in 8200 confirmed that the
commander is the author, as also reported by Haaretz.) Describing
human personnel as a “bottleneck” that
limits the army’s capacity during a military operation, the
commander laments: “We [humans] cannot process so much information.
It doesn’t matter how many people you have tasked to produce targets
during the war — you still cannot produce enough targets per day.”

The solution to this problem, he says, is artificial intelligence. The
book offers a short guide to building a “target machine,” similar
in description to Lavender, based on AI and machine-learning
algorithms. Included in this guide are several examples of the
“hundreds and thousands” of features that can increase an
individual’s rating, such as being in a WhatsApp group with a known
militant, changing cell phones every few months, and changing addresses
frequently. 

“The more information, and the more variety, the better,” the
commander writes. “Visual information, cellular information, social
media connections, battlefield information, phone contacts, photos.”
While humans select these features at first, the commander continues,
over time the machine will come to identify features on its own. This,
he says, can enable militaries to create “tens of thousands of
targets,” while the actual decision as to whether or not to attack
them will remain a human one.

The book isn’t the only time a senior Israeli commander hinted at
the existence of human target machines like Lavender. +972 and Local
Call have obtained footage of a private lecture given by the commander
of Unit 8200’s secretive Data Science and AI center, “Col.
Yoav,” at Tel Aviv University’s AI week in 2023, which
was reported on at the time in the Israeli media.

In the lecture, the commander speaks about a new, sophisticated target
machine used by the Israeli army that detects “dangerous people”
based on their likeness to existing lists of known militants on which
it was trained. “Using the system, we managed to identify Hamas
missile squad commanders,” “Col. Yoav” said in the lecture,
referring to Israel’s May 2021 military operation in Gaza, when the
machine was used for the first time. 

Slides from a lecture presentation by the commander of IDF Unit
8200’s Data Science and AI center at Tel Aviv University in 2023,
obtained by +972 and Local Call.

The lecture presentation slides, also obtained by +972 and Local Call,
contain illustrations of how the machine works: it is fed data about
existing Hamas operatives, it learns to notice their features, and
then it rates other Palestinians based on how similar they are to the
militants. 

“We rank the results and determine the threshold [at which to attack
a target],” “Col. Yoav” said in the lecture, emphasizing that
“eventually, people of flesh and blood take the decisions. In the
defense realm, ethically speaking, we put a lot of emphasis on this.
These tools are meant to help [intelligence officers] break their
barriers.” 

In practice, however, sources who have used Lavender in recent months
say human agency and precision were substituted by mass target
creation and lethality.

‘There was no “zero-error” policy’

B., a senior officer who used Lavender, echoed to +972 and Local Call
that in the current war, officers were not required to independently
review the AI system’s assessments, in order to save time and enable
the mass production of human targets without hindrances. 

“Everything was statistical, everything was neat — it was very
dry,” B. said. He noted that this lack of supervision was permitted
despite internal checks showing that Lavender’s calculations were
considered accurate only 90 percent of the time; in other words, it
was known in advance that 10 percent of the human targets slated for
assassination were not members of the Hamas military wing at all.

For example, sources explained that the Lavender machine sometimes
mistakenly flagged individuals who had communication patterns similar
to known Hamas or PIJ operatives — including police and civil
defense workers, militants’ relatives, residents who happened to
have a name and nickname identical to that of an operative, and Gazans
who used a device that once belonged to a Hamas operative. 

“How close does a person have to be to Hamas to be [considered by an
AI machine to be] affiliated with the organization?” said one source
critical of Lavender’s inaccuracy. “It’s a vague boundary. Is a
person who doesn’t receive a salary from Hamas, but helps them with
all sorts of things, a Hamas operative? Is someone who was in Hamas in
the past, but is no longer there today, a Hamas operative? Each of
these features — characteristics that a machine would flag as
suspicious — is inaccurate.”

Palestinians at the site of an Israeli airstrike in Rafah, in the
southern Gaza Strip, February 24, 2024. (Abed Rahim Khatib/Flash90)

Similar problems exist with the ability of target machines to assess
the phone used by an individual marked for assassination. “In war,
Palestinians change phones all the time,” said the source. “People
lose contact with their families, give their phone to a friend or a
wife, maybe lose it. There is no way to rely 100 percent on the
automatic mechanism that determines which [phone] number belongs to
whom.”

According to the sources, the army knew that the minimal human
supervision in place would not discover these faults. “There was no
‘zero-error’ policy. Mistakes were treated statistically,” said
a source who used Lavender. “Because of the scope and magnitude, the
protocol was that even if you don’t know for sure that the machine
is right, you know that statistically it’s fine. So you go for
it.”

“It has proven itself,” said B., the senior source. “There’s
something about the statistical approach that sets you to a certain
norm and standard. There has been an illogical amount of [bombings] in
this operation. This is unparalleled, in my memory. And I have much
more trust in a statistical mechanism than a soldier who lost a friend
two days ago. Everyone there, including me, lost people on October 7.
The machine did it coldly. And that made it easier.”

Another intelligence source, who defended the reliance on the
Lavender-generated kill lists of Palestinian suspects, argued that it
was worth investing an intelligence officer’s time only to verify
the information if the target was a senior commander in Hamas. “But
when it comes to a junior militant, you don’t want to invest
manpower and time in it,” he said. “In war, there is no time to
incriminate every target. So you’re willing to take the margin of
error of using artificial intelligence, risking collateral damage and
civilians dying, and risking attacking by mistake, and to live with
it.”

B. said that the reason for this automation was a constant push to
generate more targets for assassination. “In a day without targets
[whose feature rating was sufficient to authorize a strike], we
attacked at a lower threshold. We were constantly being pressured:
‘Bring us more targets.’ They really shouted at us. We finished
[killing] our targets very quickly.”

He explained that when lowering the rating threshold of Lavender, it
would mark more people as targets for strikes. “At its peak, the
system managed to generate 37,000 people as potential human
targets,” said B. “But the numbers changed all the time, because
it depends on where you set the bar of what a Hamas operative is.
There were times when a Hamas operative was defined more broadly, and
then the machine started bringing us all kinds of civil defense
personnel, police officers, on whom it would be a shame to waste
bombs. They help the Hamas government, but they don’t really
endanger soldiers.”

Palestinians at the site of a building destroyed by an Israeli
airstrike in Rafah, in the southern Gaza Strip, March 18, 2024. (Abed
Rahim Khatib/Flash90)

One source who worked with the military data science team that trained
Lavender said that data collected from employees of the Hamas-run
Internal Security Ministry, whom he does not consider to be militants,
was also fed into the machine. “I was bothered by the fact that when
Lavender was trained, they used the term ‘Hamas operative’
loosely, and included people who were civil defense workers in the
training dataset,” he said.

The source added that even if one believes these people deserve to be
killed, training the system based on their communication profiles made
Lavender more likely to select civilians by mistake when its
algorithms were applied to the general population. “Since it’s an
automatic system that isn’t operated manually by humans, the meaning
of this decision is dramatic: it means you’re including many people
with a civilian communication profile as potential targets.”

‘We only checked that the target was a man’

The Israeli military flatly rejects these claims. In a statement to
+972 and Local Call, the IDF Spokesperson denied using artificial
intelligence to incriminate targets, saying these are merely
“auxiliary tools that assist officers in the process of
incrimination.” The statement went on: “In any case, an
independent examination by an [intelligence] analyst is required,
which verifies that the identified targets are legitimate targets for
attack, in accordance with the conditions set forth in IDF directives
and international law.”  

However, sources said that the only human supervision protocol in
place before bombing the houses of suspected “junior” militants
marked by Lavender was to conduct a single check: ensuring that the
AI-selected target is male rather than female. The assumption in the
army was that if the target was a woman, the machine had likely made a
mistake, because there are no women among the ranks of the military
wings of Hamas and PIJ.

“A human being had to [verify the target] for just a few seconds,”
B. said, explaining that this became the protocol after realizing the
Lavender system was “getting it right” most of the time. “At
first, we did checks to ensure that the machine didn’t get confused.
But at some point we relied on the automatic system, and we only
checked that [the target] was a man — that was enough. It doesn’t
take a long time to tell if someone has a male or a female voice.” 

To conduct the male/female check, B. claimed that in the current war,
“I would invest 20 seconds for each target at this stage, and do
dozens of them every day. I had zero added value as a human, apart
from being a stamp of approval. It saved a lot of time. If [the
operative] came up in the automated mechanism, and I checked that he
was a man, there would be permission to bomb him, subject to an
examination of collateral damage.”

Palestinians emerge from the rubble of houses destroyed in Israeli
airstrikes in the city of Rafah, southern Gaza Strip, November 20,
2023. (Abed Rahim Khatib/Flash90)

In practice, sources said this meant that for civilian men marked in
error by Lavender, there was no supervising mechanism in place to
detect the mistake. According to B., a common error occurred “if the
[Hamas] target gave [his phone] to his son, his older brother, or just
a random man. That person will be bombed in his house with his family.
This happened often. These were most of the mistakes caused by
Lavender,” B. said.

STEP 2: LINKING TARGETS TO FAMILY HOMES

‘Most of the people you killed were women and children’

The next stage in the Israeli army’s assassination procedure is
identifying where to attack the targets that Lavender generates.

In a statement to +972 and Local Call, the IDF Spokesperson claimed in
response to this article that “Hamas places its operatives and
military assets in the heart of the civilian population,
systematically uses the civilian population as human shields, and
conducts fighting from within civilian structures, including sensitive
sites such as hospitals, mosques, schools and UN facilities. The IDF
is bound by and acts according to international law, directing its
attacks only at military targets and military operatives.” 

The six sources we spoke to echoed this to some degree, saying that
Hamas’ extensive tunnel system deliberately passes under hospitals
and schools; that Hamas militants use ambulances to get around; and
that countless military assets have been situated near civilian
buildings. The sources argued that many Israeli strikes kill civilians
as a result of these tactics by Hamas — a characterization that
human rights groups warn obscures Israel’s responsibility for
inflicting the casualties.

However, in contrast to the Israeli army’s official statements, the
sources explained that a major reason for the unprecedented death toll
from Israel’s current bombardment is the fact that the army has
systematically attacked targets in their private homes, alongside
their families — in part because it was easier from an intelligence
standpoint to mark family houses using automated systems.

Indeed, several sources emphasized that, as opposed to numerous cases
of Hamas operatives engaging in military activity from civilian areas,
in the case of systematic assassination strikes, the army routinely
made the active choice to bomb suspected militants when inside
civilian households from which no military activity took place. This
choice, they said, was a reflection of the way Israel’s system of
mass surveillance in Gaza is designed.

Palestinians rush to bring the wounded, including many children, to
Al-Shifa Hospital in Gaza City as Israeli forces continue pounding the
Gaza Strip, October 11, 2023. (Mohammed Zaanoun/Activestills)

The sources told +972 and Local Call that since everyone in Gaza had a
private house with which they could be associated, the army’s
surveillance systems could easily and automatically “link”
individuals to family houses. In order to identify in real time the
moment operatives enter their houses, various additional automated
programs have been developed. These programs track thousands of
individuals simultaneously, identify when they are at home, and send
an automatic alert to the targeting officer, who then marks the house
for bombing. One of these tracking programs, revealed here for the
first time, is called “Where’s Daddy?”

“You put hundreds [of targets] into the system and wait to see who
you can kill,” said one source with knowledge of the system.
“It’s called broad hunting: you copy-paste from the lists that the
target system produces.”
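
As the sources describe it, this layer amounts to a watchlist with an
automatic alert: marked individuals are monitored, and when one is
detected at the home linked to them, the system notifies a targeting
officer. The short sketch below illustrates only that general flow;
the names, data structures, and the assumption that a surveillance
feed supplies each person's current location are hypothetical, not
details of the actual software.

    # Illustrative sketch of a watchlist-and-alert flow. All names and
    # data structures are hypothetical; only the general flow described
    # by the sources is represented.
    from dataclasses import dataclass

    @dataclass
    class TrackedPerson:
        name: str
        linked_home: str       # residence the system associates with them
        current_location: str  # assumed to come from a surveillance feed

    def check_watchlist(watchlist, notify):
        """Call notify() for each tracked person currently detected at
        their linked home -- the alert that, per the sources, is what
        set a strike in motion."""
        for person in watchlist:
            if person.current_location == person.linked_home:
                notify(person)

    check_watchlist(
        [TrackedPerson("target_x", "home_1", "home_1")],
        notify=lambda p: print(f"{p.name} detected at {p.linked_home}"),
    )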

Evidence of this policy is also clear from the data: during the first
month of the war, more than half of the fatalities — 6,120 people
— belonged to 1,340 families, many of which were completely wiped
out while inside their homes, according to UN figures. The proportion
of entire families bombed in their houses in the current war is much
higher than in the 2014 Israeli operation in Gaza (which was
previously Israel’s deadliest war on the Strip), further suggesting
the prominence of this policy.

Another source said that each time the pace of assassinations waned,
more targets were added to systems like Where’s Daddy? to locate
individuals that entered their homes and could therefore be bombed. He
said that the decision of who to put into the tracking systems could
be made by relatively low-ranking officers in the military
hierarchy. 

“One day, totally of my own accord, I added something like 1,200 new
targets to the [tracking] system, because the number of attacks [we
were conducting] decreased,” the source said. “That made sense to
me. In retrospect, it seems like a serious decision I made. And such
decisions were not made at high levels.”

The sources said that in the first two weeks of the war, “several
thousand” targets were initially inputted into locating programs
like Where’s Daddy?. These included all the members of Hamas’
elite special forces unit the Nukhba, all of Hamas’ anti-tank
operatives, and anyone who entered Israel on October 7. But before
long, the kill list was drastically expanded. 

“In the end it was everyone [marked by Lavender],” one source
explained. “Tens of thousands. This happened a few weeks later, when
the [Israeli] brigades entered Gaza, and there were already fewer
uninvolved people [i.e. civilians] in the northern areas.” According
to this source, even some minors were marked by Lavender as targets
for bombing. “Normally, operatives are over the age of 17, but that
was not a condition.”

Wounded Palestinians are treated on the floor due to overcrowding at
Al-Shifa Hospital, Gaza City, central Gaza Strip, October 18, 2023.
(Mohammed Zaanoun/Activestills)

Lavender and systems like Where’s Daddy? were thus combined with
deadly effect, killing entire families, sources said. By adding a name
from the Lavender-generated lists to the Where’s Daddy? home
tracking system, A. explained, the marked person would be placed under
ongoing surveillance, and could be attacked as soon as they set foot
in their home, collapsing the house on everyone inside.

“Let’s say you calculate [that there is one] Hamas [operative]
plus 10 [civilians in the house],” A. said. “Usually, these 10
will be women and children. So absurdly, it turns out that most of the
people you killed were women and children.”

STEP 3: CHOOSING A WEAPON

‘We usually carried out the attacks with “dumb bombs”’

Once Lavender has marked a target for assassination, army personnel
have verified that they are male, and tracking software has located
the target in their home, the next stage is picking the munition with
which to bomb them.

In December 2023, CNN reported that
according to U.S. intelligence estimates, about 45 percent of the
munitions used by the Israeli air force in Gaza were “dumb” bombs,
which are known to cause more collateral damage than guided bombs. In
response to the CNN report, an army spokesperson quoted in the article
said: “As a military committed to international law and a moral code
of conduct, we are devoting vast resources to minimizing harm to the
civilians that Hamas has forced into the role of human shields. Our
war is against Hamas, not against the people of Gaza.”

Three intelligence sources, however, told +972 and Local Call that
junior operatives marked by Lavender were assassinated only with dumb
bombs, in the interest of saving more expensive armaments. The
implication, one source explained, was that the army would not strike
a junior target if they lived in a high-rise building, because the
army did not want to spend a more precise and expensive “floor
bomb” (with more limited collateral effect) to kill him. But if a
junior target lived in a building with only a few floors, the army was
authorized to kill him and everyone in the building with a dumb bomb.

Palestinians at the site of a building destroyed by an Israeli
airstrike in Rafah, in the southern Gaza Strip, March 18, 2024. (Abed
Rahim Khatib/Flash90)

“It was like that with all the junior targets,” testified C., who
used various automated programs in the current war. “The only
question was, is it possible to attack the building in terms of
collateral damage? Because we usually carried out the attacks with
dumb bombs, and that meant literally destroying the whole house on top
of its occupants. But even if an attack is averted, you don’t care
— you immediately move on to the next target. Because of the system,
the targets never end. You have another 36,000 waiting.”

STEP 4: AUTHORIZING CIVILIAN CASUALTIES

‘We attacked almost without considering collateral damage’

One source said that when attacking junior operatives, including those
marked by AI systems like Lavender, the number of civilians they were
allowed to kill alongside each target was fixed during the initial
weeks of the war at up to 20. Another source claimed the fixed number
was up to 15. These “collateral damage degrees,” as the military
calls them, were applied broadly to all suspected junior militants,
the sources said, regardless of their rank, military importance, and
age, and with no specific case-by-case examination to weigh the
military advantage of assassinating them against the expected harm to
civilians. 

According to A., who was an officer in a target operation room in the
current war, the army’s international law department has never
before given such “sweeping approval” for such a high collateral
damage degree. “It’s not just that you can kill any person who is
a Hamas soldier, which is clearly permitted and legitimate in terms of
international law,” A. said. “But they directly tell you: ‘You
are allowed to kill them along with many civilians.’ 

“Every person who wore a Hamas uniform in the past year or two could
be bombed with 20 [civilians killed as] collateral damage, even
without special permission,” A. continued. “In practice, the
principle of proportionality did not exist.”

According to A., this was the policy for most of the time that he
served. Only later did the military lower the collateral damage
degree. “In this calculation, it could also be 20 children for a
junior operative … It really wasn’t like that in the past,” A.
explained. Asked about the security rationale behind this policy, A.
replied: “Lethality.”

Palestinians wait to receive the bodies of their relatives who were
killed in Israeli airstrikes, at Al-Najjar Hospital in Rafah, southern
Gaza Strip, November 7, 2023. (Abed Rahim Khatib/Flash90)

The predetermined and fixed collateral damage degree helped accelerate
the mass creation of targets using the Lavender machine, sources said,
because it saved time. B. claimed that the number of civilians they
were permitted to kill in the first week of the war per suspected
junior militant marked by AI was fifteen, but that this number “went
up and down” over time. 

“At first we attacked almost without considering collateral
damage,” B. said of the first week after October 7. “In practice,
you didn’t really count people [in each house that is bombed],
because you couldn’t really tell if they’re at home or not. After
a week, restrictions on collateral damage began. The number dropped
[from 15] to five, which made it really difficult for us to attack,
because if the whole family was home, we couldn’t bomb it. Then they
raised the number again.”

‘We knew we would kill over 100 civilians’

Sources told +972 and Local Call that now, partly due to American
pressure, the Israeli army is no longer mass-generating junior human
targets for bombing in civilian homes. The fact that most homes in the
Gaza Strip were already destroyed or damaged, and almost the entire
population has been displaced, also impaired the army’s ability to
rely on intelligence databases and automated house-locating
programs. 

E. claimed that the massive bombardment of junior militants took place
only in the first week or two of the war, and then was stopped mainly
so as not to waste bombs. “There is a munitions economy,” E. said.
“They were always afraid that there would be [a war] in the northern
arena [with Hezbollah in Lebanon]. They don’t attack these kinds of
[junior] people at all anymore.” 

However, airstrikes against senior ranking Hamas commanders are still
ongoing, and sources said that for these attacks, the military is
authorizing the killing of “hundreds” of civilians per target —
an official policy for which there is no historical precedent in
Israel, or even in recent U.S. military operations.

“In the bombing of the commander of the Shuja’iya Battalion, we
knew that we would kill over 100 civilians,” B. recalled of a Dec. 2
bombing that the IDF Spokesperson said was aimed at assassinating
Wisam Farhat. “For me, psychologically, it
was unusual. Over 100 civilians — it crosses some red line.”

A ball of fire and smoke rises during Israeli airstrikes in the Gaza
Strip, October 9, 2023. (Atia Mohammed/Flash90)

Amjad Al-Sheikh, a young Palestinian from Gaza, said many of his
family members were killed in that bombing. A resident of Shuja’iya,
east of Gaza City, he was at a local supermarket that day when he
heard five blasts that shattered the glass windows. 

“I ran to my family’s house, but there were no buildings there
anymore,” Al-Sheikh told +972 and Local Call. “The street was
filled with screams and smoke. Entire residential blocks turned to
mountains of rubble and deep pits. People began to search in the
cement, using their hands, and so did I, looking for signs of my
family’s house.” 

Al-Sheikh’s wife and baby daughter survived — protected from the
rubble by a closet that fell on top of them — but he found 11 other
members of his family, among them his sisters, brothers, and their
young children, dead under the rubble. According to the human rights
group B’Tselem, the bombing that day destroyed dozens
of buildings, killed dozens of people, and buried hundreds under the
ruins of their homes.

‘Entire families were killed’

Intelligence sources told +972 and Local Call they took part in even
deadlier strikes. In order to assassinate Ayman Nofal, the commander
of Hamas’ Central Gaza Brigade, a source said the army authorized
the killing of approximately 300 civilians, destroying several
buildings in airstrikes on Al-Bureij refugee camp on Oct. 17, based on
an imprecise pinpointing of Nofal. Satellite footage and videos from
the scene show the destruction of several large multi-storey apartment
buildings.

“Between 16 to 18 houses were wiped out in the attack,” Amro
Al-Khatib, a resident of the camp, told +972 and Local Call. “We
couldn’t tell one apartment from the other — they all got mixed up
in the rubble, and we found human body parts everywhere.”

In the aftermath, Al-Khatib recalled around 50 dead bodies being
pulled out of the rubble, and around 200 people wounded, many of them
gravely. But that was just the first day. The camp’s residents spent
five days pulling the dead and injured out, he said.

Palestinians digging with bare hands find a dead body in the rubble
after an Israeli airstrike that killed dozens of Palestinians in the
middle of the Al-Maghazi refugee camp, central Gaza Strip, November 5,
2023. (Mohammed Zaanoun/Activestills)

Nael Al-Bahisi, a paramedic, was one of the first on the scene. He
counted between 50 and 70 casualties on that first day. “At a certain
moment, we understood the target of the strike was Hamas commander
Ayman Nofal,” he told +972 and Local Call. “They killed him, and
also many people who didn’t know he was there. Entire families with
children were killed.”

Another intelligence source told +972 and Local Call that the
army destroyed a high-rise building in Rafah in mid-December, killing
“dozens of civilians,” in order to try to kill Mohammed Shabaneh,
the commander of Hamas’ Rafah Brigade (it is not clear
whether or not he was killed in the attack). Often, the source said,
the senior commanders hide in tunnels that pass under civilian
buildings, and therefore the choice to assassinate them with an
airstrike necessarily kills civilians.

“Most of those injured were children,” said Wael Al-Sir, 55, who
witnessed the large-scale strike believed by some Gazans to have been
the assassination attempt. He told +972 and Local Call that the
bombing on Dec. 20 destroyed an “entire residential block” and
killed at least 10 children.

“There was a completely permissive policy regarding the casualties
of [bombing] operations — so permissive that in my opinion it had an
element of revenge,” D., an intelligence source, claimed. “The
core of this was the assassinations of senior [Hamas and PIJ
commanders] for whom they were willing to kill hundreds of civilians.
We had a calculation: how many for a brigade commander, how many for a
battalion commander, and so on.”

“There were regulations, but they were just very lenient,” said
E., another intelligence source. “We’ve killed people with
collateral damage in the high double-digits, if not low triple-digits.
These are things that haven’t happened before.”

Palestinians inspect their homes and try to rescue their relatives
from under the rubble after an Israeli airstrike in the city of Rafah,
southern Gaza Strip, October 22, 2023. (Abed Rahim Khatib/Flash90)

Such a high rate of “collateral damage” is exceptional not only
compared to what the Israeli army previously deemed acceptable, but
also compared to the wars waged by the United States in Iraq, Syria,
and Afghanistan. 

General Peter Gersten, Deputy Commander for Operations and
Intelligence in the operation to fight ISIS in Iraq and Syria, told a
U.S. defense magazine in 2021 that an attack with collateral damage of
15 civilians deviated from procedure; to carry it out, he had to
obtain special permission from the head of the U.S. Central Command,
General Lloyd Austin, who is now Secretary of Defense. 

“With Osama Bin Laden, you’d have an NCV [Non-combatant Casualty
Value] of 30, but if you had a low-level commander, his NCV was
typically zero,” Gersten said. “We ran zero for the longest
time.”

‘We were told: “Whatever you can, bomb”’

All the sources interviewed for this investigation said that Hamas’
massacres on October 7 and kidnapping of hostages greatly influenced
the army’s fire policy and collateral damage degrees. “At first,
the atmosphere was painful and vindictive,” said B., who was drafted
into the army immediately after October 7, and served in a target
operation room. “The rules were very lenient. They took down four
buildings when they knew the target was in one of them. It was crazy.

“There was a dissonance: on the one hand, people here were
frustrated that we were not attacking enough,” B. continued. “On
the other hand, you see at the end of the day that another thousand
Gazans have died, most of them civilians.”

“There was hysteria in the professional ranks,” said D., who was
also drafted immediately after October 7. “They had no idea how to
react at all. The only thing they knew to do was to just start bombing
like madmen to try to dismantle Hamas’ capabilities.”

Defence Minister Yoav Gallant speaks with Israeli soldiers at a
staging area not far from the Gaza fence, October 19, 2023. (Chaim
Goldberg/Flash90)

D. stressed that they were not explicitly told that the army’s goal
was “revenge,” but expressed that “as soon as every target
connected to Hamas becomes legitimate, and with almost any collateral
damage being approved, it is clear to you that thousands of people are
going to be killed. Even if officially every target is connected to
Hamas, when the policy is so permissive, it loses all meaning.”

A. also used the word “revenge” to describe the atmosphere inside
the army after October 7. “No one thought about what to do
afterward, when the war is over, or how it will be possible to live in
Gaza and what they will do with it,” A. said. “We were told: now
we have to fuck up Hamas, no matter what the cost. Whatever you can,
you bomb.”

B., the senior intelligence source, said that in retrospect, he
believes this “disproportionate” policy of killing Palestinians in
Gaza also endangers Israelis, and that this was one of the reasons he
decided to be interviewed.

“In the short term, we are safer, because we hurt Hamas. But I think
we’re less secure in the long run. I see how all the bereaved
families in Gaza — which is nearly everyone — will raise the
motivation for [people to join] Hamas 10 years down the line. And it
will be much easier for [Hamas] to recruit them.”

In a statement to +972 and Local Call, the Israeli army denied much of
what the sources told us, claiming that “each target is examined
individually, while an individual assessment is made of the military
advantage and collateral damage expected from the attack … The IDF
does not carry out attacks when the collateral damage expected from
the attack is excessive in relation to the military advantage.”

STEP 5: CALCULATING COLLATERAL DAMAGE

‘The model was not connected to reality’

According to the intelligence sources, the Israeli army’s
calculation of the number of civilians expected to be killed in each
house alongside a target — a procedure examined in a previous
investigation by +972 and Local Call — was conducted with the help
of automatic and
inaccurate tools. In previous wars, intelligence personnel would spend
a lot of time verifying how many people were in a house that was set
to be bombed, with the number of civilians liable to be killed listed
as part of a “target file.” After October 7, however, this
thorough verification was largely abandoned in favor of automation. 

In October, The New York Times reported on
a system operated from a special base in southern Israel, which
collected information from mobile phones in the Gaza Strip and provided
the military with a live estimate of the number of Palestinians who
fled the northern Gaza Strip southward. Brig. General Udi Ben Muha
told the Times that “It’s not a 100 percent perfect system — but
it gives you the information you need to make a decision.” The
system operates according to colors: red marks areas where there are
many people, and green and yellow mark areas that have been relatively
cleared of residents. 

Palestinians walk on a main road after fleeing from their homes in
Gaza City to the southern part of Gaza, November 10, 2023. (Atia
Mohammed/Flash90)

The sources who spoke to +972 and Local Call described a similar
system for calculating collateral damage, which was used to decide
whether to bomb a building in Gaza. They said that the software
calculated the number of civilians residing in each home before the
war — by assessing the size of the building and reviewing its list
of residents — and then reduced those numbers by the proportion of
residents who supposedly evacuated the neighborhood. 

To illustrate, if the army estimated that half of a neighborhood’s
residents had left, the program would count a house that usually had
10 residents as a house containing five people. To save time, the
sources said, the army did not surveil the homes to check how many
people were actually living there, as it did in previous operations,
to find out if the program’s estimate was indeed accurate.

“This model was not connected to reality,” claimed one source.
“There was no connection between those who were in the home now,
during the war, and those who were listed as living there prior to the
war. [On one occasion] we bombed a house without knowing that there
were several families inside, hiding together.” 

The source said that although the army knew that such errors could
occur, this imprecise model was adopted nonetheless, because it was
faster. As such, the source said, “the collateral damage calculation
was completely automatic and statistical” — even producing figures
that were not whole numbers.
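
The arithmetic the sources describe here is simple: a pre-war
resident count for the building, scaled by the share of the
neighborhood assumed to still be present. The sketch below reproduces
only that arithmetic, using the hypothetical numbers from the example
above; it is not the army's actual software.

    # Illustrative arithmetic only, following the example given above.
    def estimate_occupants(pre_war_residents, evacuated_fraction):
        """Automated estimate of how many people are in the home:
        pre-war residents scaled by the share assumed to remain."""
        return pre_war_residents * (1 - evacuated_fraction)

    # If half the neighborhood is assumed to have left, a 10-resident
    # house is counted as holding 5 people, even if, as one source said,
    # several displaced families are actually sheltering inside.
    print(estimate_occupants(10, 0.5))    # 5.0
    # Other evacuation estimates yield fractional figures, matching the
    # sources' account of outputs that were not whole numbers.
    print(estimate_occupants(10, 0.75))   # 2.5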

STEP 6: BOMBING A FAMILY HOME

‘You killed a family for no reason’

The sources who spoke to +972 and Local Call explained that there was
sometimes a substantial gap between the moment that tracking systems
like Where’s Daddy? alerted an officer that a target had entered
their house, and the bombing itself — leading to the killing of
whole families even without hitting the army’s target. “It
happened to me many times that we attacked a house, but the person
wasn’t even home,” one source said. “The result is that you
killed a family for no reason.”

Three intelligence sources told +972 and Local Call that they had
witnessed an incident in which the Israeli army bombed a family’s
private home, and it later turned out that the intended target of the
assassination was not even inside the house, since no further
verification was conducted in real time.

Palestinians receive the bodies of relatives who were killed in
Israeli airstrikes, Al-Najjar Hospital, southern Gaza Strip, November
6, 2023. (Abed Rahim Khatib/Flash90)

“Sometimes [the target] was at home earlier, and then at night he
went to sleep somewhere else, say underground, and you didn’t know
about it,” one of the sources said. “There are times when you
double-check the location, and there are times when you just say,
‘Okay, he was in the house in the last few hours, so you can just
bomb.’” 

Another source described a similar incident that affected him and made
him want to be interviewed for this investigation. “We understood
that the target was home at 8 p.m. In the end, the air force bombed
the house at 3 a.m. Then we found out [in that span of time] he had
managed to move himself to another house with his family. There were
two other families with children in the building we bombed.”

In previous wars in Gaza, after the assassination of human targets,
Israeli intelligence would carry out bomb damage assessment (BDA)
procedures — a routine post-strike check to see if the senior
commander was killed and how many civilians were killed along with
him. As revealed in a previous +972 and Local Call investigation, this
involved listening in to phone calls of relatives who lost their loved
ones. In the current war, however, at least in relation to junior
militants marked using AI, sources say this procedure was abolished in
order to save time. The sources said they did not know how many
civilians were actually killed in each strike, and for the low-ranking
suspected Hamas and PIJ operatives marked by AI, they did not even
know whether the target himself was killed.

_Yuval Abraham is a journalist and filmmaker based in Jerusalem._

