
Human agency remains key in Israel’s AI-powered war tactics

The devastation caused by Israel in Gaza paints a grim picture reminiscent of old-school warfare: entire neighborhoods reduced to rubble, streets swallowed by craters, and a cloud of dust blotting out the sun. The sheer tonnage of explosives dropped on Gaza is reported to exceed that of the bombs that leveled Hiroshima and Nagasaki during World War II, and the destruction rivals some of the most devastating episodes of urban warfare in history, from the London Blitz to the Vietnam War.

In stark contrast to past wars, Israel’s assault on Gaza is propelled by cutting-edge technology. Recent reports have revealed the pivotal role of AI in the bloodshed, with the military using systems named “Lavender” and “Where’s Daddy?” to generate assassination targets based on alleged affiliations with militant groups. This high-tech approach has accelerated the pace of killing, raising concerns about the morality of such actions.

The revelation of AI’s involvement in targeting raises ethical questions and challenges the notion that modern warfare minimizes casualties. Israeli military officials have significantly lowered the criteria for selecting targets, permitting hundreds of civilian deaths in order to eliminate a single military target. The emphasis has shifted from limiting civilian casualties to causing maximum damage.


An Israeli airstrike in Al-Bureij refugee camp in central Gaza, January 2, 2024. (Oren Ben Hakoon/Flash90)

Despite the use of AI-powered targeting systems, the ultimate responsibility for the carnage in Gaza lies with human decision-makers. The rapid generation of targets and quick authorization process have escalated the rate of civilian casualties, highlighting the need for ethical considerations in employing such technology in warfare.

Rapid generation, rapid authorization

With the daily death rate in Gaza skyrocketing, the Israeli military’s recourse to AI-powered systems for target identification has raised significant concerns. The rapid accumulation of targets and hasty authorization of strikes without due diligence have led to a surge in civilian casualties, underscoring the ethical dilemmas posed by AI weaponry.
