illustration: DALL-E

The study, published in Cognition, Technology & Work, suggests that while people do follow robots' instructions to some extent, machines still fall short when it comes to authority, motivation, and efficiency. The findings open a new discussion about the psychology of automation in the workplace.
Robots in Charge. Obedience Without Authority
According to the research team from SWPS, robots can trigger obedience, but not as reliably as human supervisors. Dr. Konrad Maj, head of the HumanTech Center for Social and Technological Innovation, explained that although humanoid machines can function in positions of authority, their psychological impact is weaker. He noted that HR departments and employers must take into account how employees perceive robots, how much they trust them, and how likely they are to resist following their commands.
The study showed that when compared to human supervisors, robots led to lower performance levels. Participants supervised by robots took more time to complete their tasks and achieved worse results. This highlights a key issue: automation in leadership roles doesn’t guarantee better outcomes unless it’s well-aligned with psychological factors.
The researchers emphasized that replacing a human boss with a machine should never be a simple switch. Whether people will accept and follow a robotic supervisor depends on cultural context, organizational norms, and ethical considerations - especially in regard to accountability for decisions made by machines.
What the Study Looked Like
The experiment took place in a lab at SWPS University and involved participants performing a basic yet repetitive task: renaming file extensions on a computer. Subjects were randomly assigned to one of two groups - one supervised by a humanoid robot named Pepper, and the other by a human researcher. If participants hesitated or stopped for more than 10 seconds, the robot or human would verbally prompt them to continue.
Results showed a significant difference in performance:
| Supervision Type | Average Time per File | Average Files Renamed |
|---|---|---|
| Human Supervisor | 23 seconds | 355 files |
| Robot Supervisor | 82 seconds | 224 files |
The human-supervised group worked faster and completed more tasks. Researchers concluded that although robots can provide instructions, they do not stimulate productivity or motivation as effectively as human managers.
These outcomes suggest that emotional and social dynamics still play a major role in workplace performance. Robots, despite their technical capabilities, may lack the human connection that encourages engagement and effort.
Between Trust and the Uncanny Valley
The interaction between humans and robots is complex. According to the SWPS team, people tend to trust machines more when they have human-like features - but only to a certain point. When a robot looks almost, but not exactly, like a person, it can trigger discomfort or even fear. This phenomenon is known as the uncanny valley.
Dr. Maj explained that this discomfort arises from conflicting reactions: a machine that looks human but behaves oddly can confuse us both cognitively and emotionally. From an evolutionary standpoint, he said, humans are wired to avoid things that resemble illness or threat. Robots that mimic people but fall short can seem "wrong" in a way that puts us on edge.
At the same time, anthropomorphism - the attribution of human traits to machines - has practical benefits. It makes robots easier to work with. A robot that talks, moves, and reacts like a person feels more intuitive. But as Dr. Maj pointed out, there's a danger in making robots too human-like. People may start forming emotional bonds, assigning them rights, or expecting reciprocal behavior.
He warned that as robots become more advanced, they may blur the boundaries between tool and companion. In the long term, this could change how humans relate to one another. Hyper-personalized machines, always patient and available, could make human relationships feel messy or disappointing by comparison.
Ethics, Mental Health, and the Future of Work
The psychological and ethical implications of humanoid robots go beyond the workplace. Dr. Maj and his colleagues expressed concern that people who spend significant time interacting with social robots might become more isolated. Although such technologies may help the elderly or lonely feel supported, over-reliance on machines may reduce the quality and frequency of real human contact.
Dr. Maj stressed that humans need human interaction for mental well-being. AI agents and robots may offer convenience, but they cannot replace the richness of interpersonal relationships.
He also warned that robotization of the workplace, if poorly designed, could lower job satisfaction and engagement. While people may obey robotic managers, they don’t necessarily feel motivated by them. This, in turn, can reduce productivity. The researchers argued that any implementation of robotic systems in professional settings should be participatory, backed by evidence, and tailored to the specific needs of the workforce.
They concluded that workplace robotization cannot rely solely on technical functionality. It must also reflect social and psychological realities. Without this consideration, businesses risk introducing systems that look efficient on paper - but reduce performance and morale in practice.
***
source: SWPS University
More about research:
Maj, K., Grzyb, T., Doliński, D., & Franjo, M. (2025). Comparing obedience and efficiency in tedious task performance under human and humanoid robot supervision. Cognition, Technology & Work. https://doi.org/10.1007/s10111-024-00787-1