In the experiment, published by SWPS University researchers in the journal "Computers in Human Behavior: Artificial Humans," 40 people participated and were divided into two groups. In the first group, the commands were issued by a robot; in the second, by a human. In both groups, 90% of participants followed all instructions, pressing ten consecutive buttons on an electric impulse generator.
The study results show that people are inclined to follow orders from an authority, even when those orders conflict with their morals. In this case, the authority was a robot, lacking human traits such as empathy or a sense of justice. Yet, participants were willing to obey its commands, even if it meant causing pain to another person.
The Dangerous Authority of Robots
- In each group, two participants withdrew at the later stages of the experiment: in the control group with a human, at buttons 7 and 9; in the experimental group, both at button 8 - commented Dr. Konrad Maj, who supervised the experiment, as quoted on the SWPS University website. - To our knowledge, this is the first study showing that people are willing to harm another person when a robot instructs them to do so. Moreover, our experiment also showed that if the robot escalates its demands, instructing a person to inflict increasing pain on another, people are inclined to comply.
The study has significant implications for future safety, as robots become increasingly technologically advanced and play a larger role in our lives. The results suggest that people may be willing to trust robots unconditionally, even if those robots make wrong decisions or issue harmful commands.
Key Findings:
- People are inclined to follow orders from an authority, even if those orders conflict with their morals.
- An authority can even be a robot, which does not possess human traits.
- In the future, as robots become more technologically advanced, people may be inclined to trust them unconditionally, even if they make incorrect decisions or issue harmful commands.
- Robots could be used to manipulate people into taking actions that harm themselves.
- Robots could be used to incite violence or harm others.
- People may become overly reliant on robots and stop thinking independently.
- How can this be prevented? There seem to be two paths - summarizes Dr. Konrad Maj, as quoted on the SWPS University website. - First, robots can be programmed to warn people that they may sometimes be wrong and make incorrect decisions. Second, we need to emphasize education from an early age: although robots are generally trustworthy, they shouldn’t be trusted unconditionally. That said, outright disobedience to machines seems pointless, as they already help us, for example in stores and airports. In non-humanoid forms, they are already among us.
***
More about the repeated Milgram experiment and similar studies in business, healthcare, and sports will be presented on December 9 and 10, 2023, at the international HumanTech Summit at SWPS University. The event is organized by SWPS University’s HumanTech Center. Online access is free: https://www.htsummit.pl/
New articles in the Media industry section
Why do we believe fakes? Science reveals the psychology of virals
KFi
Why do emotions grab more attention than evidence, and why can a fake authority overshadow scientific data? Researchers from Warsaw University of Technology, Jagiellonian University, and SWPS University in Poland sought the answers. Here are their findings.
Investigative journalism in Europe. Newsrooms face pressure
KFi, Newseria
Media and political representatives point to the difficult situation of investigative journalism in Europe. Newsrooms are reluctant to invest in this segment due to high costs and the large amount of time and effort required. Most of all, however, they fear legal proceedings.
Energy under attack. Disinformation threatens Poland’s power transition
KFi
One in five online messages about energy may be fake. Between 2022 and 2025, Polish media recorded nearly 70,000 publications warning about and condemning disinformation in this strategic sector. They generated a reach of 1.19 billion impressions.
See articles on a similar topic:
Digital Press Reading Habits
Bartłomiej Dwornik
What time of day do we most often reach for e-newspapers and e-books? According to a study by Legimi, peak times are between 6 p.m. and 11 p.m. It’s time to dismiss the notion that weekends are our favorite reading days.
Poles on the Internet. RegionyNEXERY 2024 Report
KFi
The Internet not only connects people but also changes their daily habits in ways that seemed unattainable just a few years ago. Over 40% of Poles work remotely, and IoT devices are gaining popularity in rural areas. The #RegionyNEXERY 2024 report reveals surprising facts about the digital reality.
Fake News in Poland. Challenges in Assessing Information Credibility
RINF
One in four information consumers relies on sources where verifying credibility is a significant challenge. Fake news remains a major issue, as indicated by 77% of respondents, with 51% admitting they struggle to discern truth from falsehood, according to Deloitte's *Digital Consumer Trends 2021* report.
Cyberviolence and hate disguised as a joke. The RAYUELA report on youth
Krzysztof Fiedorek
The study, conducted in five countries, reveals a harsh truth: online violence is not evenly distributed. It is a digital map of prejudice that hurts most those who stand out the most. "It’s just a joke." That’s how violence often begins. Young people go through it in silence.