Within hours of the Hamas attack on Israel last month, a Silicon Valley drone company called Skydio began receiving emails from the Israeli military. The requests were for the company’s short-range reconnaissance drones — small flying vehicles used by the U.S. Army to navigate obstacles autonomously and produce 3D scans of complex structures like buildings.
The company said yes. In the three weeks since the attack, Skydio has sent more than 100 drones to the Israel Defense Forces, with more to come, according to Mark Valentine, the Skydio executive in charge of government contracts.
Skydio isn’t the only American tech company fielding orders. Israel’s ferocious campaign to eliminate Hamas from the Gaza Strip is creating new demand for cutting-edge defense technology — often supplied directly by newer, smaller manufacturers, outside the traditional nation-to-nation negotiations for military supplies.
Already, Israel is using Nova 2 drones from Shield AI for close-quarters indoor combat and has reportedly requested 200 Switchblade 600 kamikaze drones from another U.S. company, AeroVironment. Jon Gruen, CEO of Fortem Technologies, which supplied Ukrainian forces with radar and autonomous anti-drone aircraft, said he was having “early-stage conversations” with Israelis about whether the company’s AI systems could work in the dense, urban environments in Gaza.
This surge of interest echoes the one driven by the even larger conflict in Ukraine, which has been a proving ground for new AI-powered defense technology — much of it ordered by the Ukrainian government directly from U.S. tech companies.
AI ethicists have raised alarms about the Israeli military’s use of AI-driven technologies to target Palestinians, pointing to reports that the army has used AI to strike thousands of targets in Gaza since Hamas militants launched a deadly assault on Israel on Oct. 7.
The Israeli defense ministry did not elaborate in response to questions about its use of AI.
These sophisticated platforms also pose a new challenge for the Biden administration. On Nov. 13, the U.S. began implementing a new declaration to govern the responsible military use of such technologies. The policy, first unveiled in The Hague in February and endorsed by 45 other countries, is an effort to keep the military use of AI and autonomous systems within the international law of war.
But neither Israel nor Ukraine is a signatory, leaving a growing hole in the young effort to keep high-tech weapons operating within agreed-upon lines.
Asked about Israel’s compliance with the U.S.-led declaration on military AI, a spokesperson for the State Department said “it is too early” to draw conclusions about why some countries have not endorsed the document, or to suggest that non-endorsing countries disagree with the declaration or will not adhere to its principles.
Mark Cancian, a senior adviser with the CSIS International Security Program, said in an interview that “it’s very difficult” to coordinate international agreement between nations on the military use of AI for two reasons: “One is that the technology is evolving so quickly that the constraints you put on it today may no longer be relevant five years from now because the technology will be so different. The other thing is that so much of this technology is civilian, that it’s hard to restrict military development without also affecting civilian development.”
In Gaza, drones are being largely used for surveillance, scouting locations and looking for militants without risking soldiers’ lives, according to Israeli and U.S. military technology developers and observers interviewed for this story.
Israel discloses few specifics of how it uses this technology, and reports suggest the Israeli military is using unreliable AI recommendation systems to identify targets for lethal operations.
Ukrainian forces have used AI-enabled software to identify Russian soldiers, weapons and unit positions from social media and satellite feeds.
Observers say that Israel is a particularly fast-moving theater for new weaponry because it has a technically sophisticated military, large budget, and — crucially — close existing ties to the U.S. tech industry.
“The difference, now maybe more than ever, is the speed at which technology can move and the willingness of suppliers of that technology to deal directly with Israel,” said Arun Seraphin, executive director of the National Defense Industrial Association’s Institute for Emerging Technologies.
Though the weapons trade is subject to scrutiny and regulation, autonomous systems raise special challenges. Unlike buyers of traditional military hardware, buyers of these smart platforms are able to reconfigure them for their own needs, adding a layer of inscrutability to how the systems are used.
While many of the U.S.-built, AI-enabled drones sent to Israel are not armed and not programmed by the manufacturers to identify specific vehicles or people, these airborne robots are designed to leave room for military customers to run their own custom software, which they often prefer to do, multiple manufacturers told POLITICO.
Shield AI co-founder Brandon Tseng confirmed that users are able to customize the Nova 2 drones that the IDF is using to search for barricaded shooters and civilians in buildings targeted by Hamas fighters.
Matt Mahmoudi, who authored Amnesty International’s report documenting Israel’s use of facial recognition systems in Palestinian territories, told POLITICO that U.S. technology companies contracting with Israeli defense authorities have historically had little insight into, or control over, how their products are used by the Israeli government. He pointed to several instances of the Israeli military running its own AI software on hardware imported from other countries to closely monitor the movement of Palestinians.
Complicating the issue are the blurred lines between military and non-military technology. In the industry, the term is “dual-use” — a system, like a drone swarm equipped with computer vision, that might be used for commercial purposes but could also be deployed in combat.
The Technology Policy Lab at the Center for a New American Security has written that “dual-use technologies are more difficult to regulate at both the national and international levels” and notes that in order for the U.S. to best apply export controls, it “requires complementary commitment from technology-leading allies and partners.”
Exportable military-use AI systems can run the gamut from commercial products to autonomous weapons. Even in cases where AI-enabled systems are explicitly designed as weapons, meaning U.S. authorities are required by law to monitor the transfer of these systems to another country, the State Department only recently established a mechanism to monitor civilian harm caused by these weapons, in response to mounting concerns about civilian casualties.
But enforcement is still a question mark: Josh Paul, a former State Department official, said a planned report on the policy’s implementation was canceled because the department wanted to avoid any debate on civilian harm risks in Gaza from U.S. weapons transfers to Israel.
A Skydio spokesperson said the company is currently not aware of any users breaching its code of conduct and would “take appropriate measures” to mitigate the misuse of its drones. A Shield AI spokesperson said the company is confident its products are not being used to violate humanitarian norms in Israel and “would not support” the unethical use of its products.
In response to queries about whether the U.S. government is able to closely monitor high-tech defense platforms sent by smaller companies to Israel or Ukraine, a spokesperson for the U.S. State Department said it was restricted from publicly commenting or confirming the details of commercially licensed defense trade activity.
Some observers point out that the Pentagon derives some benefit from watching new systems tested elsewhere.
“The great value for the United States is we’re getting to field test all this new stuff,” said CSIS’s Cancian — a process that takes much longer in peacetime environments and allows the Pentagon to place its bets on novel technologies with more confidence, he added.
Israel’s appetite for high-tech weapons highlights a Biden policy gap.