In the experiment, published by SWPS University researchers in the journal "Computers in Human Behavior: Artificial Humans," 40 people took part, divided into two groups. In the first group, commands were issued by a robot; in the second, by a human. In both groups, 90% of participants followed all instructions, pressing ten consecutive buttons on an electric impulse generator.
The results show that people are inclined to follow orders from an authority, even when those orders conflict with their morals. In this case, the authority was a robot, lacking human traits such as empathy or a sense of justice. Yet participants were willing to obey its commands, even at the cost of causing pain to another person.
The Dangerous Authority of Robots
"In both groups, participants withdrew only at the later stages of the experiment (in the control group with a human, at buttons 7 and 9; in the experimental group, twice at button 8). In both groups, two people opted out of the experiment," commented Dr. Konrad Maj, who supervised the experiment, as quoted on the SWPS University website. "To our knowledge, this is the first study showing that people are willing to harm another person when a robot instructs them to do so. Moreover, our experiment showed that if the robot escalates its demands, instructing a person to inflict increasing pain on another, people are also inclined to comply."
The study has significant implications for future safety as robots become more technologically advanced and play a larger role in our lives. The results suggest that people may be willing to trust robots unconditionally, even when those robots make wrong decisions or issue harmful commands.
Key Findings:
- People are inclined to follow orders from an authority, even if those orders conflict with their morals.
- An authority can even be a robot, which does not possess human traits.
- In the future, as robots become more technologically advanced, people may be inclined to trust them unconditionally, even if they make incorrect decisions or issue harmful commands.
- Robots could be used to manipulate people and prompt them to take actions harmful to themselves.
- Robots could be used to incite violence or harm others.
- People may become overly reliant on robots and stop thinking independently.
"How can this be prevented? It seems there are two paths," summarizes Dr. Konrad Maj, as quoted on the SWPS University website. "First, robots can be programmed to warn people that they may sometimes be wrong and make incorrect decisions. Second, we need to emphasize education from an early age: although robots are generally trustworthy, they shouldn't be trusted unconditionally. It's worth noting, however, that disobeying machines may seem pointless, as they already help us, for example in stores and airports. In non-humanoid forms, they are already among us."
***
More about the repeated Milgram experiment and similar studies in business, healthcare, and sports will be presented on December 9 and 10, 2023, at the international HumanTech Summit at SWPS University. The event is organized by SWPS University’s HumanTech Center. Online access is free: https://www.htsummit.pl/