
The study, published in Cognition, Technology & Work, suggests that while people do follow robots' instructions to some extent, machines still fall short when it comes to authority, motivation, and efficiency. The findings open a new discussion about the psychology of automation in the workplace.
Robots in Charge: Obedience Without Authority
According to the research team from SWPS, robots can trigger obedience, but not as reliably as human supervisors. Dr. Konrad Maj, head of the HumanTech Center for Social and Technological Innovation, explained that although humanoid machines can function in positions of authority, their psychological impact is weaker. He noted that HR departments and employers must take into account how employees perceive robots, how much they trust them, and how likely they are to resist following their commands.
The study showed that when compared to human supervisors, robots led to lower performance levels. Participants supervised by robots took more time to complete their tasks and achieved worse results. This highlights a key issue: automation in leadership roles doesn’t guarantee better outcomes unless it’s well-aligned with psychological factors.
The researchers emphasized that replacing a human boss with a machine should never be a simple switch. Whether people will accept and follow a robotic supervisor depends on cultural context, organizational norms, and ethical considerations - especially in regard to accountability for decisions made by machines.
What the Study Looked Like
The experiment took place in a lab at SWPS University and involved participants performing a basic yet repetitive task: renaming file extensions on a computer. Subjects were randomly assigned to one of two groups - one supervised by a humanoid robot named Pepper, and the other by a human researcher. If participants hesitated or stopped for more than 10 seconds, the robot or human would verbally prompt them to continue.
Results showed a significant difference in performance:
| Supervision Type | Average Time per File | Average Files Renamed |
|---|---|---|
| Human Supervisor | 23 seconds | 355 files |
| Robot Supervisor | 82 seconds | 224 files |
The human-supervised group worked faster and completed more tasks. Researchers concluded that although robots can provide instructions, they do not stimulate productivity or motivation as effectively as human managers.
These outcomes suggest that emotional and social dynamics still play a major role in workplace performance. Robots, despite their technical capabilities, may lack the human connection that encourages engagement and effort.
Between Trust and the Uncanny Valley
The interaction between humans and robots is complex. According to the SWPS team, people tend to trust machines more when they have human-like features - but only to a certain point. When a robot looks almost, but not exactly, like a person, it can trigger discomfort or even fear. This phenomenon is known as the uncanny valley.
Dr. Maj explained that this discomfort arises from conflicting reactions: a machine that looks human but behaves oddly can confuse us both cognitively and emotionally. From an evolutionary standpoint, he said, humans are wired to avoid things that resemble illness or threat. Robots that mimic people but fall short can seem "wrong" in a way that puts us on edge.
At the same time, anthropomorphism - the attribution of human traits to machines - has practical benefits. It makes robots easier to work with. A robot that talks, moves, and reacts like a person feels more intuitive. But as Dr. Maj pointed out, there's a danger in making robots too human-like. People may start forming emotional bonds, assigning them rights, or expecting reciprocal behavior.
He warned that as robots become more advanced, they may blur the boundaries between tool and companion. In the long term, this could change how humans relate to one another. Hyper-personalized machines, always patient and available, could make human relationships feel messy or disappointing by comparison.
Ethics, Mental Health, and the Future of Work
The psychological and ethical implications of humanoid robots go beyond the workplace. Dr. Maj and his colleagues expressed concern that people who spend significant time interacting with social robots might become more isolated. Although such technologies may help the elderly or lonely feel supported, over-reliance on machines may reduce the quality and frequency of real human contact.
Dr. Maj stressed that humans need human interaction for mental well-being. AI agents and robots may offer convenience, but they cannot replace the richness of interpersonal relationships.
He also warned that robotization of the workplace, if poorly designed, could lower job satisfaction and engagement. While people may obey robotic managers, they don’t necessarily feel motivated by them. This, in turn, can reduce productivity. The researchers argued that any implementation of robotic systems in professional settings should be participatory, backed by evidence, and tailored to the specific needs of the workforce.
They concluded that workplace robotization cannot rely solely on technical functionality. It must also reflect social and psychological realities. Without this consideration, businesses risk introducing systems that look efficient on paper - but reduce performance and morale in practice.
***
source: SWPS University
More about research:
Maj, K., Grzyb, T., Doliński, D., & Franjo, M. (2025). Comparing obedience and efficiency in tedious task performance under human and humanoid robot supervision. Cognition, Technology & Work.
https://doi.org/10.1007/s10111-024-00787-1