Security

Microsoft admits weaponizing AI in Gaza war, ‘for Hamas hospital attack’

Kim Jong-chan, Security · 2025. 5. 17. 14:05

 


 

Microsoft on Thursday posted a blog post on its website acknowledging that it sold advanced artificial intelligence (AI) and cloud computing services to the Israeli military during the Gaza war and helped in efforts to find and rescue Israeli hostages.

The AI military technology Microsoft provided appears to have been central to Israel's indiscriminate ground strikes on underground tunnels in Gaza, attacks on what Israel called Hamas headquarters that it justified by claiming Hamas used hospitals and civilians as shields.

“We are in an incredible moment where corporations, not governments, are dictating terms of use to governments that are actively involved in a conflict,” Emelia Probasco, a senior fellow at Georgetown University’s Center for Security and Emerging Technologies, told the AP about private U.S. companies supplying military technology for Israel’s attacks on Hamas. “It’s like a tank manufacturer telling another country that they can use our tanks only for ‘this specific reason’ and then exporting them.”

The MS statement acknowledged the “military use” and said it had “no visibility into how customers use our software on their own servers or other devices,” but denied responsibility for its “inability to verify Israeli military use” by saying it had no way of knowing how its products might be used by other commercial cloud providers.

In addition to Microsoft, the Israeli military has extensive contracts with Google, Amazon, Palantir, and several other major US tech companies for cloud or AI services.

Microsoft said only that “the Israeli military, like all of its customers, is required to follow the company’s Acceptable Use Policy and AI Code of Conduct, which prohibits the use of its products to cause harm in a manner prohibited by law,” and that “the company has found no ‘evidence’ that the Israeli military has violated those provisions.” 

In a statement, Microsoft said it had provided the Israeli military with software, professional services, Azure cloud storage, and Azure AI services (including language translation), and worked with the Israeli government to protect its cyberspace from external threats. 

The statement acknowledged the military use by saying it had provided Israel with “special access to our technology beyond the terms of a commercial contract” and “limited emergency assistance” as part of efforts to rescue more than 250 hostages kidnapped by Hamas on October 7.

The AP, which first reported the AI-powered attacks on civilian facilities in Gaza, said on the 16th that “the AP investigation revealed previously unreported details of a close partnership between the U.S. tech giant and the Israeli Defense Forces, as military use of commercial AI products surged nearly 200-fold following the deadly Hamas attack on Oct. 7, 2023.”

 The AP reported that the Israeli military was using Azure to transcribe, translate and process information collected through mass surveillance and then cross-check it with Israel’s in-house AI-powered targeting systems, and vice versa. Human rights groups have raised concerns that flawed and error-prone AI systems are being used to determine who or what to target, resulting in the deaths of innocent people.

The U.S.-backed partnership reflects a growing push by tech companies to sell AI products to militaries for a wide range of uses, including in Israel, Ukraine and the United States. The AP specifically stated that “Microsoft declined to respond to written questions from the AP about how its AI models helped the military translate, classify and analyze information used to select targets for airstrikes,” and that “Microsoft’s statement did not directly answer several questions about how exactly the Israeli military is using the technology, and the company declined to comment further.”

 

On February 18, AP’s global investigative reporting team (investigative@ap.org) reported on the massacre of civilians in Gaza using AI under the title, “As Israel uses U.S.-made AI models in war, concerns are raised about the technology’s role in determining who lives and who dies.” 

The AP reported that “Big American tech companies have been secretly empowering Israel to track and kill more militants faster in Gaza and Lebanon through a surge in artificial intelligence and computing services, and the sharp increase in civilian deaths has raised fears that these tools are contributing to the deaths of innocent people.” The AP also reported that “the military has hired private companies to develop customized autonomous weapons for years, and Israel’s recent war is a major example of the active use of commercial AI models made in the United States despite concerns that they were not originally developed to help determine who lives and who dies.”

 

The AP reported that Microsoft’s internal intelligence investigation found that the Israeli military’s use of Microsoft and OpenAI AI had increased nearly 200-fold by March 2024 compared with the week before the Oct. 7, 2023 attacks, and that the amount of data stored on Microsoft servers since the beginning of the Gaza war doubled to more than 13.6 petabytes by July 2024, roughly 350 times the digital memory needed to store every book in the Library of Congress.

 

The AP cited the active use of Microsoft's massive computer servers in the Gaza war as evidence, noting that "the military's use of Microsoft's massive computer servers also increased by nearly two-thirds in the first two months of the war alone." Israel used the new AI-infused military technology to locate a Hamas commander whose whereabouts had been difficult to track despite its superior eavesdropping technology, and killed him in an airstrike on October 31, 2023 that also killed some 125 civilians in the vicinity.

The Israeli military developed a new AI tool similar to ChatGPT and trained it on millions of Arabic conversations obtained through surveillance of Palestinians in the occupied territories. Its intelligence Unit 8200 had for decades intercepted countless text messages, transcribed phone calls, and scraped Arabic-dialect posts from social media across Palestine, including the Gaza Strip. In the early months of the Gaza war, it used that trove to build a large language model (LLM) powering a chatbot that could run queries in Arabic, then merged the tool with a multimedia database so that intelligence analysts could run complex searches across images and videos to track underground locations that wiretapping had failed to find.

 

Lee Jae-myung, the Democratic Party candidate for president, pledged “reducing polarization with AI” and “fair opportunities and results” in a TV debate on the 25th.

On the 22nd, the candidate held a one-on-one conversation with the Israeli professor Harari at the National Assembly, saying that “conflicts in the world come from inequality” and that he would “solve the inequality gap with AI.”

 

The AP reported on the 17th, shortly after President Trump's Middle East tour ended, that "the crisis in Gaza has reached one of its darkest moments as Israel blocks all food and supplies from entering the territory and continues its intense bombing campaign," that "humanitarian officials warn that famine will engulf the region, and doctors say there are no medicines to treat common ailments. But Israeli leaders are threatening a much more powerful ground attack," and that "500,000 people in Gaza are on the verge of starvation and 1 million are in extreme hunger."

<AI Technology: Israel Attacks Hamas; Lee Jae-myung's 'Relieve Polarization with AI', April 26, 2025>