Personalized robots: the ethical issues

Personalized robots learn about each user’s specific needs and abilities and adapt their behaviors accordingly (Matarić & Scassellati, 2016). Robots that can adapt and personalize their behaviors offer clear advantages, but they also raise several ethical issues. This article offers a brief overview of some of the ethical challenges surrounding personalized robots, focusing on privacy, transparency, over-attachment, and access.

Privacy

Using personalized technologies always carries a privacy risk, as these systems rely on user information and profiling (Toch et al., 2012). Personalized robots collect profiling information about each individual user in order to learn their needs and characteristics and adapt behaviors accordingly. In doing so, robots gather sensitive personal data and enter into more private aspects of users’ lives, which constitutes a privacy intrusion. Beyond intrusion, there are also risks to the security of users’ personal data, such as public disclosure or exploitation of collected data for commercial gain. To address these concerns, designers and developers need to prioritize finding a balance between privacy and personalization. This balance can be pursued through privacy-enhancing techniques, such as data minimization and allowing users to choose which data to share with the system (Kobsa, 2007).
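To make the idea of data minimization concrete, the following is a minimal sketch (not drawn from any cited system; the field names and consent set are hypothetical) of how a personalized robot’s software could retain only the profile fields a user has explicitly consented to share:

```python
# Hypothetical sketch of data minimization: keep only the profile fields
# the user has explicitly consented to share, and discard everything else
# before the data ever reaches the personalization module.

def minimize_profile(raw_profile: dict, consented_fields: set) -> dict:
    """Return a copy of the profile containing only consented fields."""
    return {k: v for k, v in raw_profile.items() if k in consented_fields}

# Example: the user consents to sharing mobility level and language,
# but not medical history (all field names here are illustrative).
consented = {"mobility_level", "preferred_language"}
raw = {
    "mobility_level": "low",
    "preferred_language": "en",
    "medical_history": "...",  # sensitive; not consented, so dropped
}
print(minimize_profile(raw, consented))
```

The point of the sketch is architectural rather than algorithmic: filtering happens before storage or learning, so data the user declined to share never enters the system at all.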

Transparency

In robotics, transparency refers to how well users understand a robot’s abilities, intentions, and limitations (Wortham et al., 2017). As technology grows more complex and prevalent, it is essential that a technological system be self-explanatory, so that its users understand the actions and aims of the system they interact with (Mueller, 2016). Thus, when personalizing robots, it is important to prioritize explainability in order to make systems as transparent as possible. Users of personalized robots should understand how the system works, why it uses their data, and how that data leads to behavioral adaptation.
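One simple way to support this kind of explainability is to pair every behavioral adaptation with a plain-language account of the data that triggered it. The sketch below is purely illustrative (the function and message format are assumptions, not from any cited system):

```python
# Illustrative sketch: each adaptation the robot makes is paired with a
# plain-language explanation naming the user data it was based on, so the
# user can see why the behavior changed.

def explain_adaptation(adaptation: str, data_used: str) -> str:
    """Build a user-facing explanation for one behavioral adaptation."""
    return f"I {adaptation} because your profile indicates {data_used}."

msg = explain_adaptation(
    "slowed my speech",
    "a preference for a slower speaking pace",
)
print(msg)
```

Even a template this simple makes the link between collected data and adapted behavior visible, which is the core of the transparency requirement described above.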

Over-attachment

People often develop emotional bonds or attachment to robots they interact with (Sharkey, 2016). This attachment can have negative effects, such as reduced human contact and social isolation. The problem of over-attachment typically arises when robots are designed deceptively, for instance when their appearance or behavior is excessively natural and human-like. This risk applies to most social robots, but it could be further exacerbated in personalized ones: interactions with personalized robots can feel more natural and interpersonal, as these robots understand their users and adapt their behaviors to them. As a result, users may become attached to these robots more easily. Crucially, the heightened likelihood of deception and attachment poses greater risks for certain vulnerable groups. These concerns deserve particular attention when designing personalized robotic systems.

Equitable access

Fair access to technological resources is an important ethical principle for technology development (Veruggio, 2006). Experts and stakeholders need to advocate more strongly for fair access and distribution, especially for technologies used in areas such as healthcare and education, as personalized robots often are. Unequal access to technology can exacerbate socio-economic disparities, making it essential to address this issue.

This article has provided a brief reflection on some potential risks and ethical concerns associated with personalized robots. Further critical examination, as well as detailed guidance to inform the work of robot designers and developers, is needed to address the issues in this emerging field of robotics.

References

Kobsa, Alfred. “Privacy-enhanced personalization.” Communications of the ACM 50, no. 8 (2007): 24-33.

Matarić, Maja J., and Brian Scassellati. “Socially assistive robotics.” Springer handbook of robotics (2016): 1973-1994.

Mueller, Erik T. “Transparent Computers: Designing Understandable Intelligent Systems.” Erik T. Mueller, San Bernardino, CA (2016).

Sharkey, Amanda J. C. “Should we welcome robot teachers?” Ethics and Information Technology 18 (2016): 283-297.

Toch, Eran, Yang Wang, and Lorrie Faith Cranor. “Personalization and privacy: a survey of privacy risks and remedies in personalization-based systems.” User Modeling and User-Adapted Interaction 22 (2012): 203-220.

Veruggio, Gianmarco. “The euron roboethics roadmap.” In 2006 6th IEEE-RAS international conference on humanoid robots, pp. 612-617. IEEE, 2006.

Wortham, Robert H., Andreas Theodorou, and Joanna J. Bryson. “Robot transparency: Improving understanding of intelligent behaviour for designers and users.” In Towards Autonomous Robotic Systems: 18th Annual Conference, TAROS 2017, Guildford, UK, July 19–21, 2017, Proceedings 18, pp. 274-289. Springer International Publishing, 2017.