Identifying Ethical Issues of Human Enhancement Technologies in the Military


Authors
  1. Girling, K.
  2. Thorpe, J.
  3. Auger, A.
Corporate Authors
Defence Research and Development Canada, Ottawa, Ont. (CAN), Corporate Office
Abstract
Defence and security organizations depend on science and technology to meet operational needs, predict and counter threats, and address the increasingly complex demands of modern warfare. Rapid advances in science and technology for Human Enhancement (HE) could provide potential solutions to a wide range of military gaps and deficiencies. However, the unique nature of these tools may challenge existing policies, laws and values, and their use can introduce complicated ethical issues, leading to policy gaps that impede their evaluation and adoption by the Canadian Armed Forces. Considering the potential ethical issues raised by military HE early in development is critical both to safeguard the timely, safe and effective implementation of these tools within our forces and to ensure that we can adequately prepare for the potential use or exploitation of HE technologies by adversaries. Although a substantial body of research exists on military HE and ethics, there is an urgent need for a better understanding of the specific ethical questions that may be raised by individual HE technologies in an operational setting. In the current report, we identify and describe a sample of 34 emerging HE technologies of potential utility to the future army. Using this dataset, we identify the potential military utility of HE technologies across three broad categories: physiology, computation/cognition and automation/robotics. Herein, we also have generated a novel ethics assessment framework to facilitate th

A French-language summary is available here.

Keywords
Human enhancement; Ethics; Policy; S&T; Human optimization; Human effectiveness; Technology assessment; Emerging technologies
Report Number
DRDC-RDDC-2017-R103 — Scientific Report
Date of publication
01 Oct 2017
Format
Electronic Document (PDF)
