
The World’s at Risk as Intel, Amazon Roll-out Killer AI: Survey


Published on: Aug 23, 2019

Microsoft, Intel, and Amazon have been named among the key companies putting the world at risk through their development of killer robots, according to a survey in which key players from the tech industry were asked about their stance on lethal autonomous weapons.

A survey by Dutch NGO PAX ranked 50 companies according to three criteria:

  • Whether the technology they are developing is relevant to lethal AI
  • Whether they are working on related defense projects
  • Whether they have committed to refraining from such development in the future

The report's lead author, Frank Slijper, questioned why these companies would not rule out developing such controversial weapons, which could make the decision to kill a human without human authorization.

In recent years, the use of AI to enable weapon systems to select and attack targets on their own has spurred ethical debate. Critics have warned that such weapons would undermine international security and usher in another revolution in warfare, following gunpowder and the atomic bomb.

Last year, Google published guidelines forgoing the use of its AI in weapons, and it was ranked among the seven companies engaging in "best practice" in the analysis. This group also included Japan's SoftBank, best known for its human-like Pepper robot.

List Includes 22 Medium Concern and 21 High Concern Companies

In the survey, 22 companies were listed as of 'medium concern' and 21 were categorized as of 'high concern', most notably Microsoft and Amazon, both of which have bid on a US$10 billion Pentagon contract to provide cloud computing services to the US military.

Another company in the 'high concern' category is Palantir, a firm with roots in CIA-backed venture capital that was awarded an US$800 million contract to create an AI system to help soldiers analyze combat zones in real time.