DBT Bureau
Bengaluru, 24 August 2024
Some employees of Google’s AI division, DeepMind, have urged the search giant to terminate its contracts with military organisations, arguing that such contracts contradict Google’s AI (artificial intelligence) principles.
According to a report in TIME magazine, around 200 employees signed a letter on May 16 calling on the company to terminate its contracts with military organisations, reflecting growing concern within the AI lab about the ethical implications of using its technology for military purposes.
These concerns surfaced after reports suggested that AI technology developed by the lab is being made available to military organisations through Google’s cloud contracts.
According to the TIME report, Google’s contracts with the US and Israeli militaries grant these organisations access to cloud services that may include DeepMind’s AI technology. The revelation has sparked significant ethical concerns among DeepMind employees, who feel the contracts violate the company’s commitment to ethical AI development.
However, the TIME report noted that Google has not provided any substantial response to the employees’ demands. It quoted four unnamed employees who said they have received no meaningful response from leadership and are growing increasingly frustrated.
In response to the report, Google stated that it complies with its AI principles and that its contract with the Israeli government is not directed at highly sensitive, classified, or military workloads relevant to weapons or intelligence services. However, the partnership with the Israeli government has come under increased scrutiny in recent months.
DeepMind, a leading AI lab under Google’s umbrella, is known for its cutting-edge advancements in artificial intelligence. Google acquired it in 2014 as the tech giant bet big on the emerging technology to stay ahead of its competitors.