
The rights and wrongs of autonomous systems


Autonomous systems such as smart robots and self-driving vehicles are becoming features of our lives. But how do we ensure they’re used and developed responsibly? In the second episode of the Zero Pressure podcast, Dr Helen Sharman speaks to Professor Virginia Dignum and Associate Professor Denise Garcia about the rights and restrictions that should apply to autonomous systems and autonomous weapons.

Autonomous systems can be of great benefit to society, so long as humans take responsibility for them. And scientists and medium-sized powers have vital roles to play in creating trust and cooperation over how to regulate autonomous weapons.

These are two of the key messages to emerge from the intriguing second episode of the Zero Pressure podcast from Imperial College London and Saab: a relaxed conversation with people at the cutting edge of science and technology, hosted by the UK’s first astronaut, Helen Sharman.

The systems aren’t responsible – we are

Autonomous systems are freeing us from repetitive and time-consuming tasks in the factory, the office and the home. But the advent of self-driving cars and other autonomous vehicles makes some people nervous that we are losing control and letting machines make decisions for us.

For Virginia Dignum, Professor of Responsible Artificial Intelligence at Umeå University in Sweden, the risks of autonomous systems should always be offset by the fact that they act according to information and instructions from humans.

“Autonomous systems are software systems that have been programmed so that they can take decisions based on their environment without interaction from others. And they’ve actually been around for a long time: your thermostat is an example,” she explains.

“It’s important to realise that the system itself has no understanding. It’s just a tool, based on information that has been programmed into it.”
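To make the thermostat analogy concrete, here is a minimal, purely illustrative sketch (not from the podcast) of such a rule-based decision loop in Python. The target temperature, tolerance and sensor readings are hypothetical values a human designer would choose; the point is that the system only applies comparisons programmed into it, with no understanding of its own.

```python
# Illustrative thermostat: an "autonomous" decision loop that acts on its
# environment using only rules a human has programmed into it.

TARGET_TEMP_C = 21.0  # chosen by a human; the system does not "understand" it
TOLERANCE_C = 0.5     # hypothetical dead band to avoid rapid switching

def decide(current_temp_c: float) -> str:
    """Return a heating action for the current reading.

    The system applies the comparison its designer encoded; it has no
    notion of comfort or cold.
    """
    if current_temp_c < TARGET_TEMP_C - TOLERANCE_C:
        return "HEAT_ON"
    if current_temp_c > TARGET_TEMP_C + TOLERANCE_C:
        return "HEAT_OFF"
    return "HOLD"

# Hypothetical sensor readings in degrees Celsius
for reading in [19.8, 20.7, 21.9]:
    print(f"{reading} C -> {decide(reading)}")
```

Responsibility for the outcome sits with whoever sets the target and deploys the loop, not with the loop itself, which is the point Professor Dignum makes next.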

From that standpoint, Professor Dignum tells Helen Sharman that those who worry about autonomous systems thinking for themselves and slipping from our control are looking at the issue from the wrong angle.

“We need to avoid the narrative that just because the system is autonomous, responsibility is difficult to determine. It’s not. Someone decides to use an autonomous system that someone else has made. The system itself is only one part of an ecosystem of decisions and production.”

She advocates education and training to make people aware that they can ask questions and still affect outcomes, rather than abdicating responsibility.

But the positives outweigh the negatives.

“Part of the responsibility we have is using the system when we need to use it,” says Professor Dignum.

“It’s easy to fall into talking about the negatives of these systems and what can go wrong with them, but I think one of the most important things to realise is that the worst that can happen is to not use autonomous systems when they can be beneficial for us. These types of systems can potentially solve extremely complex issues that we cannot solve by ourselves.

“We need these artefacts to enhance our capability of dealing with complexities, such as climate change and the pandemic.”

The importance of regulation

In contrast to Professor Dignum’s confidence, Denise Garcia, Associate Professor of Political Science and International Affairs at Northeastern University in the US city of Boston, is much more cautious.

Professor Garcia researches international law and the questions of lethal robotics and artificial intelligence. Her main concern is that the rise of autonomous weapons systems such as drones creates a distance between the humans behind them and responsibility for the actions these machines carry out.

“When targets are set in a war, generals usually base them on military doctrine in combination with legal advisors who know about international humanitarian law, the laws of war.

“But there are concerns that autonomous weapons, such as drones, can’t distinguish between civilians and combatants, and that they are lowering the threshold for going to war and putting human rights in peril.”

For Professor Garcia, international law is “that last beacon of humanity in warfare”, which prohibits, for example, chemical weapons and landmines. She firmly disagrees with the idea that autonomous weapons could propel us towards full disarmament.

Hopeful signs

However, Professor Garcia is also optimistic about the world’s capacity for regulating this technology.

“Small and medium-sized powers such as Sierra Leone, Brazil, Chile, Austria, Germany and Canada are finding ways to work on things that are hardest to move along the political spectrum,” she says.

“And if we look at the international treaty on the ozone layer, the International Space Station or the response to the Covid-19 pandemic, we can see how we can trust scientists and diplomats to put aside rivalries, cooperate with others and resolve issues that affect us all.”

A ‘must-hear’ podcast

This second podcast in Imperial College London and Saab’s Zero Pressure series is a ‘must-hear’, filled with stimulating debate and discussion. It’s available on most podcast platforms, including Spotify, Google Podcasts and Apple Podcasts.

Your questions, comments and suggestions for future discussions are welcome. Follow the series on your favourite podcast platform and via Twitter to keep up to date and stay involved!