Robots have long been used in manufacturing, and it is hard to imagine daily life without them. The more autonomous and advanced they become, the more important it is to consider their legal status and the question of criminal liability for their actions. Initiatives have formed around the world to further regulate the legal status of robots, which are currently treated as objects. Proposed options include treating robots as legal persons or establishing a new category of legal entity, such as an e-personality. Another option is a legal status similar to that of animals, which are neither legal persons nor mere objects. Opponents of these initiatives argue that establishing a new system is premature and would create more opportunities for abuse; this is one of the reasons the granting of Saudi citizenship to the robot Sophia, manufactured by Hanson Robotics, has been criticized. Robots already sometimes cause undesirable consequences, such as injury or death to production workers, so it is important to define who is criminally liable. Under current law, only humans and corporations can be held liable for criminal acts. Criminal liability for the actions of robots is therefore often limited to the person who used the robot as a means to achieve a criminal aim. This dissertation explores alternative possibilities for criminal liability for the actions of robots, such as robots themselves being held liable for their own actions. This would require certain preconditions: that robots make moral decisions on their own, act independently, and communicate their decisions to humans in an understandable way. If it were legally possible to hold robots criminally liable for their actions, a reasonable range of criminal sanctions for robots could be devised, but such sanctions would make sense only from the standpoint of general prevention, not from the point of view of special prevention.