Robots Could Be Given 'Human Rights' As Early As Next Month


"Robots' autonomy raises the question of their nature in the light of the existing legal categories – of whether they should be regarded as natural persons, legal persons, animals or objects – or whether a new category should be created, with its own specific features and implications as regards the attribution of rights and duties"

This is from a report out of the European Parliament, pushing for a set of laws ensuring the rights and responsibilities of robots and artificial intelligence. The laws, which are to be voted on in February, could effectively grant human rights to robots.

The report includes a proposal for this uniform definition of a "smart robot", one which:

  • acquires autonomy through sensors and/or by exchanging data with its environment (inter-connectivity) and trades and analyses data
  • is self-learning (optional criterion)
  • has a physical support
  • adapts its behaviours and actions to its environment

The report also calls for smart robots to be registered:

For the purposes of traceability and in order to facilitate the implementation of further recommendations, a system of registration of advanced robots should be introduced, based on the criteria established for the classification of robots.

The registration would be managed by an EU Agency for Robotics and Artificial Intelligence.

"Until such time, if ever, that robots become or are made self-aware, Asimov's Laws must be regarded as being directed at the designers, producers and operators of robots, since those laws cannot be converted into machine code," says report author Mady Delvaux.

Asimov's Laws were coined in 1942, and are as follows:

  • (1) A robot may not injure a human being or, through inaction, allow a human being to come to harm.
  • (2) A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
  • (3) A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.
  • (0) A robot may not harm humanity, or, by inaction, allow humanity to come to harm. (The Zeroth Law, which Asimov added in later works.)

But the biggest deal here? Rights for robots. Check this out, direct from the report.

The more autonomous robots are, the less they can be considered simple tools in the hands of other actors (such as the manufacturer, the owner, the user, etc.); whereas this, in turn, makes the ordinary rules on liability insufficient and calls for new rules which focus on how a machine can be held – partly or entirely – responsible for its acts or omissions; whereas, as a consequence, it becomes more and more urgent to address the fundamental question of whether robots should possess a legal status;

Ultimately, robots' autonomy raises the question of their nature in the light of the existing legal categories – of whether they should be regarded as natural persons, legal persons, animals or objects – or whether a new category should be created, with its own specific features and implications as regards the attribution of rights and duties, including liability for damage;

Woah.

A "code of conduct" for those producing robots is proposed, as is a requirement for companies to report how exactly robots and AI are contributing to the bottom line of the business.

The question of liability for the actions of AI is also addressed: "An obligatory insurance scheme, which could be based on the obligation of the producer to take out insurance for the autonomous robots it produces, should be established," the report reads. "The insurance system should be supplemented by a fund in order to ensure that damages can be compensated for in cases where no insurance cover exists."

On autonomous vehicles, the report says the automotive sector "is in most urgent need of European and global rules to ensure the cross-border development of automated vehicles so as to fully exploit their economic potential and benefit from the positive effects of technological trends".

Delvaux warns that the current "fragmented regulatory approaches" would hinder implementation and jeopardise competition.

"Current private international law rules on traffic accidents applicable within the EU do not need urgent modification to accommodate the development of autonomous vehicles," Delvaux says, adding that "simplifying the current dual system for defining applicable law (based on Regulation (EC) No 864/2007 of the European Parliament and of the Council and the 1971 Hague Convention on the law applicable to traffic accidents) would improve legal certainty and limit possibilities for forum shopping".

The vote is happening late next month, and needs to be unanimous to pass.
