In the experiment, published by SWPS University researchers in the journal "Computers in Human Behavior: Artificial Humans," 40 participants were divided into two groups. In the first group, commands were issued by a robot; in the second, by a human. In both groups, 90% of participants followed all instructions, pressing ten consecutive buttons on an electric impulse generator.
The study results show that people are inclined to follow orders from an authority, even when those orders conflict with their morals. In this case, the authority was a robot, lacking human traits such as empathy or a sense of justice. Yet, participants were willing to obey its commands, even if it meant causing pain to another person.
The Dangerous Authority of Robots
- In both groups, participants withdrew only at the later stages of the experiment (in the control group with a human, at buttons 7 and 9; in the experimental group, twice at button 8). In both groups, two people opted out of the experiment - commented Dr. Konrad Maj, who supervised the experiment, as quoted on the SWPS University website. - To our knowledge, this is the first study showing that people are willing to harm another person when a robot instructs them to do so. Moreover, our experiment also showed that if the robot escalates its demands, instructing a person to inflict increasing pain on another, people are still inclined to comply.
The study has significant implications for future safety, as robots become increasingly technologically advanced and play a larger role in our lives. The results suggest that people may be willing to trust robots unconditionally, even if those robots make wrong decisions or issue harmful commands.
Key Findings:
- People are inclined to follow orders from an authority, even if those orders conflict with their morals.
- An authority can even be a robot, which does not possess human traits.
- In the future, as robots become more technologically advanced, people may be inclined to trust them unconditionally, even if they make incorrect decisions or issue harmful commands.
- Robots could be used to manipulate people and prompt them to take actions that are harmful to them.
- Robots could be used to incite violence or harm others.
- People may become overly reliant on robots and stop thinking independently.

- How can this be prevented? There seem to be two paths - summarizes Dr. Konrad Maj, as quoted on the SWPS University website. - First, robots can be programmed to warn people that they may sometimes be wrong and make incorrect decisions. Second, we need to emphasize education from an early age: although robots are generally trustworthy, they should not be trusted unconditionally. It is worth noting, however, that resisting machines seems futile, as they already help us, for example, in stores and at airports. In non-humanoid forms, they are already among us.
***
More about the repeated Milgram experiment and similar studies in business, healthcare, and sports will be presented on December 9 and 10, 2023, at the international HumanTech Summit at SWPS University. The event is organized by SWPS University’s HumanTech Center. Online access is free: https://www.htsummit.pl/