
Aspects of criminal liability for the actions of robots
Gregorič, Maruša (Author), Završnik, Aleš (Mentor)

PDF - Presentation file (1.09 MB)
MD5: F97285A9C9637F8413E0CC6C3830356C

Abstract
Robots have long been used in manufacturing, and they are now becoming increasingly indispensable in everyday life. The more autonomous and advanced they become, the more important it is to thoroughly reconsider their legal status and the aspects of criminal liability for their actions. Initiatives have emerged worldwide calling for better regulation of the legal status of robots, which the law currently treats as objects. One option is to treat robots as legal persons; another is to establish a new legal subject in the form of an e-personality. The legal status of animals, which are neither legal subjects nor mere objects, could also serve as a model. Opponents of these initiatives consider the establishment of a new system premature and believe it would create more opportunities for abuse. This is also one of the reasons for criticism of Saudi Arabia granting citizenship to the robot Sophia, manufactured by Hanson Robotics. Robots already sometimes cause undesirable consequences, such as injury or death of production workers, so it is important to define who is criminally liable for their actions. Under current law, only natural and legal persons can be perpetrators of criminal offences. Criminal liability for the actions of robots is therefore often limited to the person who used the robot as a means of carrying out their criminal plan. The thesis explores alternative models of criminal liability for the actions of robots, such as holding the robot itself liable for its actions. This requires several preconditions, such as the ability to make moral decisions independently, the capacity to act autonomously, and the ability to communicate its decisions in an understandable way. If the law allowed a robot to be held liable for its own actions, a range of suitable criminal sanctions for robots does exist, but these are meaningful only from the standpoint of general prevention, not from the standpoint of special prevention.

Language:Slovenian
Keywords:Robots, artificial intelligence, legal personhood, criminal liability of robots, punishing robots
Work type:Master's thesis/paper
Organization:PF - Faculty of Law
Year:2021
PID:20.500.12556/RUL-129549
COBISS.SI-ID:75149571
Publication date in RUL:04.09.2021
Views:808
Downloads:126

Secondary language

Language:English
Title:Aspects of criminal liability for the actions of robots
Abstract:
Robots have long been used in manufacturing, and it is becoming hard to imagine daily life without them. The more autonomous and advanced they become, the more important it is to think about their legal status and the aspects of criminal liability for their actions. Initiatives have formed around the world to better regulate the legal status of robots, which are currently treated as objects in law. One possibility is to treat robots as legal persons or to establish a new form of legal entity, such as an e-personality. Another is to model their status on that of animals, which are neither legal entities nor mere objects. Opponents of these initiatives believe that establishing a new system is premature and would lead to more opportunities for abuse. This is one of the reasons for criticizing the granting of Saudi citizenship to the robot Sophia, manufactured by Hanson Robotics. Robots already sometimes cause undesirable consequences, such as injury or death of production workers, so it is important to define who is criminally liable for their actions. The law currently recognizes only natural and legal persons as perpetrators of criminal acts. Criminal liability for the actions of robots is therefore often limited to the person who used the robot as a means to achieve their criminal aim. This thesis explores alternative possibilities of criminal liability for the actions of robots, such as robots themselves being held liable for their own actions. This would require several preconditions, such as the ability to make moral decisions on their own, to act independently, and to communicate their decisions to humans in an understandable way. If it were legally possible to hold robots criminally liable for their own actions, a suitable range of criminal sanctions for robots would exist, but they would make sense only from the standpoint of general prevention, not from that of special prevention.

Keywords:Robots, artificial intelligence, legal personhood, criminal liability of robots, punishing robots
