Is the University of Edinburgh involved in research for autonomous weapons?

Partnerships between the military and UK academia are being rapidly expanded. Alba Andrés Sánchez examines whether some of them might be used to help develop lethal autonomous weapon systems (LAWS).

Article from Responsible Science journal, no.4; online publication: 15 September 2022
 


The University of Edinburgh is currently working alongside three other universities in a consortium called the University Defence Research Collaboration (UDRC). [1]  This is a multi-million-pound joint venture funded by the Ministry of Defence (MOD) and the Engineering and Physical Sciences Research Council (EPSRC), aimed at using academic research to boost military capabilities. The EPSRC also funds other networks and initiatives aimed at promoting co-operation between UK universities and the military in areas relating to artificial intelligence (AI) and data processing. [2]
 

Projects with relevance to LAWS

One of the main projects being carried out by the UDRC – and led by Edinburgh – is ‘Signal Processing in the Information Age’, which has a value of nearly £4.1m over six years. [3]  The research description includes deep learning, suggesting that the outcomes can be used for intelligence gathering and for target detection, recognition, and tracking – all critical functions of armed drones. The likely application of this research in a military setting is underlined by the involvement of project partners including BAE Systems, Leonardo, and Thales – all leading arms corporations with an active interest in autonomous systems. Moreover, the MOD gains direct access to a pool of academic signal processing expertise deployable at short notice, which raises serious ethical questions and highlights the military application of the research outcomes. There are therefore clear indicators that this project could assist with the development of LAWS.

Another project led by Edinburgh is the ‘UKRI Trustworthy Autonomous Systems Node in Governance and Regulation’, which has a value of over £2.6m over 3.5 years. [4]  It is studying trust in autonomous systems, particularly in relation to the creation of regulatory structures and to issues of responsibility and liability. [5]  Some of its project partners are also leading developers of military technology with an interest in autonomous systems – BAE Systems, Thales, and the Defence Science and Technology Laboratory. Professor Ramamoorthy, the lead researcher, also holds the Personal Chair of Robot Learning and Autonomy in the School of Informatics and has a background in developing autonomous robots. [6]  There is no mention of any ethics- or law-focused academics participating in this project, which raises questions about whether these aspects will be given due weight in the research, or whether there will be a bias towards laxer regulation of autonomous systems development driven by, for example, commercial pressures.

The University of Edinburgh also has some involvement with the BAE Systems project Tempest, [7] which aims to develop combat aircraft incorporating ‘autonomous systems’. The MOD has stated that more than 600 organisations are working on this project, including small businesses and universities. Partnerships involve organisations both inside and outside the military-industrial sector, [8] and the collaboration as a whole is known as Team Tempest. Edinburgh’s involvement takes the form of a three-way research partnership with Heriot-Watt University and Leonardo – yet the university website makes no mention of this project at all (neither its funding, its areas of research, nor the academics involved). This lack of transparency is deeply concerning. The partnership also extends to the Centres for Doctoral Training in Applied Photonics and in Robotics and Autonomous Systems. [9]  The latter forms part of the university’s Edinburgh Centre for Robotics, which offers funded PhD training programmes. The University of Edinburgh’s website states that these programmes are sponsored by the EPSRC, [10] but the links to BAE Systems and Leonardo are not disclosed, and there is no mention of how the research outcomes will be used to develop autonomous aircraft for BAE Systems. Candidates applying for these programmes will therefore be unaware of the connections between the research outcomes, arms companies, and the potential development of LAWS.
 

Student concern

Following a motion from Edinburgh University Amnesty International representatives, which was passed by the Student Council in January 2021, the university’s Students’ Association signed the Future of Life Institute pledge on LAWS. [11]  The Student Council then put pressure on the University itself to sign the pledge, but this was rejected in March following objections from the College of Science and Engineering on the grounds of the alleged dual-use nature of the research outcomes, i.e. their ability to be used for both civilian and military applications. However, the university does not have a comprehensive Research Ethics Policy that deals adequately with issues such as autonomous weapons. When its partnerships with arms companies and the MOD are also taken into account, the argument that further ethical safeguards are not needed becomes especially unconvincing.

The rapid pace of technological development in AI and robotics is raising major ethical issues, not least concerning the development of autonomous weapons. Universities – like Edinburgh – which pursue military-funded research in these areas, without much stricter safeguards, risk helping to fuel an international arms race which will endanger us all.
 

Alba Andrés Sánchez is a student of International Relations at the University of Edinburgh and a Junior Fellow for UK Stop Killer Robots Campaign.
 

References

[1] UDRC (2018). About UDRC. https://udrc.eng.ed.ac.uk/about

[2] Burt P (2018). Off The Leash: The development of autonomous military drones in the UK. Drone Wars UK. https://dronewars.net/wp-content/uploads/2018/11/dw-leash-web.pdf

[3] EPSRC (2018). Grants on the Web: EP/S000631/1.  https://gow.epsrc.ukri.org/NGBOViewGrant.aspx?GrantRef=EP/S000631/1

[4] EPSRC (2020). Grants on the Web: EP/V026607/1.  https://gow.epsrc.ukri.org/NGBOViewGrant.aspx?GrantRef=EP/V026607/1

[5] EPSRC (2020) – as note 4.

[6] University of Edinburgh (undated). Robust Autonomy and Decisions Group. http://rad.inf.ed.ac.uk

[7] Squires M (2021). Tempest: Innovation for UK security and prosperity. Team Tempest.

[9] Squires (2021) – as note 7.

[10] University of Edinburgh (2022). EPSRC Centre for Doctoral Training in Robotics and Autonomous Systems: PhD. https://www.ed.ac.uk/studying/postgraduate/degrees/index.php?r=site/view&id=863

[11] Future of Life Institute (undated). LAWS pledge. https://futureoflife.org/2018/06/05/lethal-autonomous-weapons-pledge/

