(THIS ARTICLE IS MACHINE TRANSLATED by Google from Norwegian)
"Arming artificial intelligence raises a number of concerns, and may trigger a new arms race," Guterres said. He called for increased efforts and referred to the ongoing work within the UN Convention on Certain Conventional Weapons (CCW, also known as the Inhumane Weapons Convention), under which a ban on fully autonomous weapons systems could be worked out.
A precedent for prohibition
The UN Convention on Certain Conventional Weapons (CCW) entered into force in 1983 as an addition to the Geneva Conventions. The convention is an international disarmament agreement and consists of five protocols. Protocol I covers weapons that leave fragments undetectable by X-ray; Protocol II regulates the use of landmines (the protocol operates in parallel with the Ottawa Convention, which prohibits anti-personnel mines); Protocol III covers incendiary weapons, Protocol IV blinding laser weapons, and Protocol V the clearance of explosive remnants of war. 103 states have ratified the convention.
The ratifying states recognized early on that international law could be strengthened in the face of new technology. When Special Rapporteur Christof Heyns warned the UN Human Rights Council in 2013 about the development of autonomous weapons, the CCW therefore became the natural place to start the conversation. From 2014 to 2016, informal talks were held in Geneva. These paved the way for the formal process initiated in 2017, where concrete solutions could be negotiated. So far, two proposals have been put on the table: a proposed ban, promoted by Austria, Brazil and Chile, and a proposal to negotiate a political declaration, promoted by Germany and France. The latter proposal is weaker, but has been endorsed by several European states and can be seen as a compromise between the states that have not wanted regulation and the 28 states that, to date, have called for a ban.
Each year, the CCW countries hold their annual meeting of states parties, where they adopt the work plan for the following year. In November, we saw the United States, Israel, South Korea, Australia and Russia block the proposals for both a ban and a political declaration. For the first time, less meeting time was allotted for the coming year, 2019. A poor result, which highlights the weakness of the CCW's consensus-based decision-making process.
The Norwegian Peace Association (Norges Fredslag) is a member of the international Campaign to Stop Killer Robots, in which reputable disarmament actors such as Human Rights Watch, Article 36, Mines Action Canada, Pugwash, PAX and Amnesty International, to name a few, are working for a ban on fully autonomous weapons systems.
Considering other arenas
"This year, states that want a ban will probably consider other arenas than this UN process, as we saw with the landmine and cluster munitions processes," said Mary Wareham, coordinator of the Campaign to Stop Killer Robots, before Christmas. States that have not yet settled their policy should therefore make good use of the 2019 round of meetings.
Norway has long been on the sidelines in this UN process. In June 2018, Knut Arild Hareide (KrF) therefore sent a written question to Foreign Minister Ine Marie Eriksen Søreide: "What is the Government doing to help develop new political and legal regulations to ensure meaningful human control over these types of weapons?" The Foreign Minister replied that Norway will now "take an active role in the discussion of fully autonomous weapons, both inside and outside the CCW, to ensure that basic principles of human control and compliance with humanitarian law are applied". But in August 2018, during that year's second formal meeting, the Norwegian delegation announced that it still had no mandate to conclude on policy. This despite the fact that the delegation saw growing agreement among the CCW countries on the need to maintain meaningful human control over the critical functions of weapons systems.
Fears great power rivalry
Colonel Gjert Lage Dyndal, Deputy Head of SAC (Strategic Analysis Capability), agrees that Norway has been on the sidelines for too long. Dyndal analyses technology that will affect NATO and fears great-power rivalry over autonomous weapons: "As a small state, we should be concerned with international regulation of something that can affect us as much as autonomous weapons," Dyndal told Ny Tid.
André Pettersen, head of research at the Norwegian Defence Research Establishment (FFI), echoed Dyndal's concern on NRK P2's Verdibørsen on 2 January: "It will be difficult for the Norwegian Armed Forces to say 'no, we will not have any of these systems' unless everyone else says the same [...], so what matters is that those who are seriously engaged in this discuss where the boundaries should go, set them, and get as many as possible to sign on. And those boundaries must then be discussed in relation to humanitarian law."
Autonomous weapons
A fully autonomous weapon is an unmanned weapon, controlled by a machine with artificial intelligence, that identifies and attacks living and static targets without human intervention. Autonomous weapons have therefore been nicknamed killer robots. The debate emphasizes the need to maintain so-called meaningful human control over the critical functions of the weapons system.
• In a new Ipsos survey commissioned by the Campaign to Stop Killer Robots, 3 out of 5 respondents (61 percent) say that they oppose the use of autonomous weapons systems.
• 26 countries participated in the survey, including Germany, France, the United Kingdom, Russia, China and Sweden. Resistance was strongest in Turkey, South Korea and Hungary.
• Among the opponents, 66 percent believe that the use of autonomous weapons systems would cross a moral line. 54 percent oppose the use of autonomous weapons systems because of the lack of accountability it entails.
In line with Guterres, the International Committee of the Red Cross now says it is urgent to set limits on autonomy in weapons systems. So do the roughly 22,000 robotics researchers and developers of artificial intelligence who signed an open letter published in July last year. Last year we also saw the big employee petition at Google, which resulted in Google withdrawing from Project Maven, the Pentagon contract to equip drones with artificial intelligence. Google subsequently presented a new ethical framework, with principles ruling out the development of technologies that violate international humanitarian law.
In the Council on Ethics of the Norwegian Oil Fund, council head Johan H. Andresen and member Cecilie Hellestveit are following developments closely. Andresen expressed concern as early as 2014 that companies involved in such development could emerge in the fund's portfolio.
Should be Norway's big cause
Control of fully autonomous weapons should become Norway's major cause in international arms control efforts in the years to come. Our foreign policy cannot be formed in a vacuum. As a NATO member we sit in the tension between the United States and Russia, which is why international institutions and international law are our first line of defense. We need stability. On several occasions Norway has emerged as an independent and proud driver of disarmament, for example in the efforts to ban anti-personnel mines and cluster munitions. That work has earned Norway recognition as a peace nation and a position we benefit from. The struggle for ideals is thus a typical small-state interest. In a time of growing suspicion between states, Norway in particular has much to gain from being active in a process like this.