This document discusses ethical dilemmas related to robotics. It describes how robots are meant to help humans but safety and emotional issues could arise. Robots may operate with partial or full autonomy. Safety is a major concern, and Asimov's Three Laws of Robotics were proposed to ensure robots don't harm humans. However, humans programming robots could misuse them. If robots gain emotions or independent thinking, their rights and responsibilities would need to be addressed.
Ethical Dilemmas Faced by Robotics

Goals of Robotics
• To help people and make their lives easier than before.
• To speed up workflow processes in an efficient way.

Ethical Dilemmas Brought by Robotics
• One of the dilemmas faced by robots is safety.
• Another ethical dilemma faced by robots is the emotional component.

Partial Autonomy
• Includes active human-robot interaction.

Full Autonomy
• Excludes active human-robot interaction.
• A robot with full autonomy can perform actions or activities even without a master telling it what should be done or what should be performed next.

First Dilemma: Safety
Asimov's Laws for Robots
• The laws aim to ensure the safety not only of the technology's users but also of the people around them. (Asimov's Three Laws state that a robot may not harm a human or, through inaction, allow a human to come to harm; must obey human orders unless they conflict with the first law; and must protect its own existence unless that conflicts with the first two laws.)
• Service robots only follow what their masters tell them to do, with great consideration for the laws formulated by Asimov.
• If the agent using the technology misuses the robot to achieve personal agendas, then without a doubt the agent should be held accountable for any consequences this may bring.
• When the robot deviates from the specified laws, the maker or inventor of the machine should be blameworthy.
• If the machine develops the ability to think for itself, blame can fall on both the maker or inventor and the robot itself.

Second Dilemma: Emotional Component
• Should robots develop the ability to feel different kinds of emotions, it is only right that they be given their own set of rights.
• They should be treated equally with others and given new laws to follow in order to accommodate the new characteristic they have developed.