Why Artificial Intelligence Increases the Importance of Humans in War

Recent scholarship on artificial intelligence (AI) and international security focuses on the political and ethical consequences of replacing human warriors with machines. Yet AI is not a simple substitute for human decision-making. The advances in commercial machine learning that are reducing the costs of statistical prediction are simultaneously increasing the value of data (which enable prediction) and judgment (which determines why prediction matters). But these key complements—quality data and clear judgment—may not be present, or present to the same degree, in the uncertain and conflictual business of war. This has two important strategic implications. First, military organizations that adopt AI will tend to become more complex to accommodate the challenges of data and judgment across a variety of decision-making tasks. Second, data and judgment will tend to become attractive targets in strategic competition. As a result, conflicts involving AI complements are likely to unfold very differently than visions of AI substitution would suggest. Rather than rapid robotic wars and decisive shifts in military power, AI-enabled conflict will likely involve significant uncertainty, organizational friction, and chronic controversy. Greater military reliance on AI will therefore make the human element in war even more important, not less.

That is from a new paper by Avi Goldfarb and Jon R. Lindsay, via the excellent Kevin Lewis.
