Alan Winfield, a professor of robot ethics at the University of the West of England in Bristol, and Marina Jirotka, a professor of human-centred computing at the University of Oxford, believe that robots should be equipped with a so-called "ethical black box". It would be the equivalent of the flight data recorder used in aviation, which captures the chain of events and pilot actions in an emergency and allows investigators to reconstruct those actions and determine the cause of a crash.
As robots increasingly leave industrial factories and laboratories and begin to interact directly with humans, the case for stronger safety measures is easy to justify.
Winfield and Jirotka believe that companies producing robots should follow the example of the aviation industry, which owes its safety record not only to technology and reliable assembly, but also to strict adherence to safety protocols and rigorous accident investigation. It was this industry that introduced black boxes and cockpit voice recorders which, in the event of an incident, allow crash investigators to find the true cause of events and draw vital lessons from it to improve safety and prevent similar incidents in the future.
"Serious incidents require serious investigations. But what do you do if the investigator discovers that, at the time of the incident, the robot kept no internal record of what occurred? In that case, it's almost impossible to say what really happened," Winfield told The Guardian.
An ethical black box used in robotics would record all of a robot's decisions, the chain of causal reasoning behind those decisions, its movements, and the data from its sensors. Having such a record would also help robots explain their actions in language that human users can understand, which would strengthen the interaction between humans and machines and improve the user experience.
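To make the idea concrete, the paragraph above can be sketched as a minimal, append-only recorder. This is purely illustrative: the class, field names, and the example sensor readings are assumptions for the sketch, not part of any proposed standard.

```python
import json
import time

class EthicalBlackBox:
    """Append-only recorder tying each decision to its sensor
    snapshot and stated reasons (illustrative sketch only)."""

    def __init__(self):
        self._records = []

    def log(self, sensors, decision, reasons):
        # Each entry links a decision to the sensor data and the
        # causal chain of reasons behind it, with a timestamp.
        self._records.append({
            "timestamp": time.time(),
            "sensors": sensors,
            "decision": decision,
            "reasons": reasons,
        })

    def explain_last(self):
        # Render the most recent decision in plain language,
        # so a human user can follow the robot's reasoning.
        r = self._records[-1]
        return "I chose '%s' because: %s" % (
            r["decision"], "; ".join(r["reasons"]))

    def export(self):
        # After an incident, investigators would read the full log.
        return json.dumps(self._records, indent=2)

# Hypothetical example: a delivery robot stopping for an obstacle
box = EthicalBlackBox()
box.log(
    sensors={"lidar_min_distance_m": 0.4, "speed_mps": 1.2},
    decision="emergency_stop",
    reasons=["obstacle detected 0.4 m ahead",
             "stopping distance exceeds gap"],
)
print(box.explain_last())
```

The human-readable `explain_last` output is the piece that, per Winfield and Jirotka, would let the robot account for its actions to ordinary users as well as to investigators.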
Winfield and Jirotka are not the only experts concerned with ethical issues around artificial intelligence (AI). Missy Cummings, an unmanned aerial vehicle specialist and director of the Humans and Autonomy Laboratory at Duke University in North Carolina (USA), told the BBC in March that oversight of AI is one of the most important problems for which no solution has yet been found.
"To date, we have no clearly established guidelines. And without industry standards for the development and testing of such systems, it will be difficult to deploy these technologies at scale," Cummings said.
In September 2016, Amazon, Facebook, Google, IBM, and Microsoft formed the "Partnership on AI to Benefit People and Society". The organization's main task is to ensure that AI development is conducted fairly, openly, and ethically. Apple joined in January of this year, and many other technology companies have since expressed the same intention and joined the alliance.
Meanwhile, the outreach and charitable organization Future of Life Institute (FLI) drew up the Asilomar AI Principles, a basic set of guidelines and ethical standards for robotics, designed to help ensure that AI remains reliable and beneficial for the future of humanity. FLI works with organizations and institutions such as DeepMind and MIT (Massachusetts Institute of Technology), and its scientific advisory board includes figures such as Stephen Hawking, Frank Wilczek, Elon Musk, Nick Bostrom, and even the famous American actor Morgan Freeman.
If one agrees that proactive thinking, combined with the hard work of the industry's sharpest minds, is the best defense against any future problems with AI, then one could say that humanity is already under reliable protection.
The article is based on materials