Personalization and customization: what if the role of the user in the adaptation changes how we perceive and interact with robots?

Personalization is a well-known process that plays a role in product design, interface design, and consumer psychology. Adapting both the appearance and the functionalities of a technology or service to users’ preferences and needs has become increasingly common, including in the context of social robots. However, while the technical advances that enable the personalization of robots to their users are growing fast, our theoretical understanding of how such adaptive processes affect the way users perceive and interact with robots is still limited. Existing research suggests that people hold more positive attitudes toward robots that autonomously adapt to their users, and that they tend to trust them more. However, this research does not yet explain why such positive effects occur. This lack of a theoretical explanation is problematic because it compromises any predictions regarding the impact of adaptation on positive attitudes and trust toward robots. The lack of clear-cut theoretical considerations also prevents robot designers from determining how personalization should be implemented, which features should be personalized, and whether the effects of personalization vary as a function of user characteristics such as culture, gender, age, psychological traits, etc.

Attitudes and trust are both important factors of technology acceptance and adoption. Moreover, trust is a crucial determinant of the use of a robot. Indeed, if you trust a robot too much, you may over-rely on it and use it for tasks beyond its capabilities, leading to failures or even accidents. On the other hand, if you don’t trust a robot enough, you may simply not use it at all. One of the key psychological mechanisms that facilitate positive attitudes and trust toward personalized robots is personal relevance. Personal relevance refers to the perceived match between a target and one’s preferences, needs, values, and goals. That is, if a robot adapts its characteristics to a user, this user may perceive it as more personally relevant and might consequently evaluate the robot more positively than if the robot hadn’t been personalized. A positive evaluation, in turn, may increase the user’s trust in the robot.

However, one could argue that there will always be people who have negative perceptions of personalized technologies, who distrust them, and who refuse to use them in their daily lives. This may be because some of them have privacy concerns, or feel they lack knowledge about or control over the technology. Research on the personalization of products and services has already identified these issues and concluded that, while the content of personalization (i.e., what is personalized) is important, the role of the user in the adaptation process may be even more important. From this followed the distinction between system-driven adaptation, also referred to as personalization, and user-driven adaptation, also referred to as customization.

And that may be the most important blind spot in research on adaptation to the user in HRI: such work has, thus far, mainly focused on personalization, that is, on the robot performing the adaptation without involving the user in the process. This makes sense if we consider that robot developers usually aim to make robots more autonomous in order to limit the human effort required to configure, supervise, or assist the robot. But neglecting the involvement of users in the adaptation process leaves us without an understanding of the consequences of this process for how users perceive and interact with adapted robots.


Based on the observation that both the role of the user in adaptive processes and customization are neglected in the research, we aim to fill that gap by comparing the psychological processes of customizing a robot (i.e., the user chooses its characteristics) versus personalizing a robot (i.e., the robot adapts itself based on the rules of its system). To do so, we propose a theoretical model that distinguishes two psychological routes that increase positive attitudes and trust toward robots. The first route rests on personalization: because the robot appears autonomous while adapting itself to a user (and thereby achieving personal relevance), the user may be more prone to ascribe human characteristics to it to make sense of this perceived autonomy. This psychological process is known as anthropomorphization, and it contributes to positive attitudes and trust in HRI. The second route rests on customization: here, the user is involved in adapting the characteristics of the robot. In doing so, the user experiences more self-investment, control, and knowledge during the adaptation process. Self-investment, control, and perceived knowledge may in turn trigger psychological ownership, the feeling of possessing a target. This psychological mechanism should also improve attitudes and trust toward adapted robots, mostly because we tend to positively associate the things we own with ourselves.

Figure 1: Research model of ESR13 - influence of personalization vs. customization on trust toward robots

An attentive reader may notice that the model depicted in Figure 1 does not differentiate between customization and personalization in terms of their influence on attitudes and trust toward robots. However, anthropomorphization and psychological ownership elicit different consequences. For instance, anthropomorphizing entities makes individuals more prone to hold those entities responsible for their actions, whereas psychological ownership of an entity tends to result in feelings of responsibility toward the entity. Understanding the psychological mechanisms involved in different kinds of adaptation processes could thus help us cope with the “side effects” associated with them in HRI, such as the ascription of responsibility to oneself or to the robot when a mistake occurs. Moreover, it is plausible that individuals may prefer one process (i.e., personalization) over the other (i.e., customization). For instance, previous research has suggested that people with high privacy concerns and a strong affinity for technology prefer customization over personalization. Such dispositional characteristics may thus influence the extent of the effects of personalization or customization on attitudes toward robots. In any case, it seems essential to take the role of the user into account when theorizing about the adaptation of social robots, and, likewise, to theorize about the role of the user in adaptation. Integrating both aspects will provide fruitful avenues for future research.