In the experiment, published by SWPS University researchers in the journal "Computers in Human Behavior: Artificial Humans," 40 participants were divided into two groups. In the first group, the commands were issued by a robot; in the second, by a human. In both groups, 90% of participants followed all instructions, pressing ten consecutive buttons on an electric impulse generator.
The study results show that people are inclined to follow orders from an authority, even when those orders conflict with their morals. In this case, the authority was a robot, lacking human traits such as empathy or a sense of justice. Yet, participants were willing to obey its commands, even if it meant causing pain to another person.
The Dangerous Authority of Robots
"In both groups, participants withdrew only at the later stages of the experiment: in the control group with a human, at buttons 7 and 9; in the experimental group, twice at button 8. In both groups, two people opted out of the experiment," commented Dr. Konrad Maj, who supervised the experiment, as quoted on the SWPS University website. "To our knowledge, this is the first study showing that people are willing to harm another person when a robot instructs them to do so. Moreover, our experiment also showed that if the robot escalates its demands, instructing a person to inflict increasing pain on another, people are inclined to comply."
The study has significant implications for future safety, as robots become increasingly technologically advanced and play a larger role in our lives. The results suggest that people may be willing to trust robots unconditionally, even if those robots make wrong decisions or issue harmful commands.
Key Findings:
- People are inclined to follow orders from an authority, even if those orders conflict with their morals.
- An authority can even be a robot, which does not possess human traits.
- In the future, as robots become more technologically advanced, people may be inclined to trust them unconditionally, even if they make incorrect decisions or issue harmful commands.
- Robots could be used to manipulate people and prompt them to take actions that are harmful to them.
- Robots could be used to incite violence or harm others.
- People may become overly reliant on robots and stop thinking independently.
"How can this be prevented? It seems there are two paths," summarizes Dr. Konrad Maj, as quoted on the SWPS University website. "First, robots can be programmed to warn people that they may sometimes be wrong and make incorrect decisions. Second, we need to emphasize education from an early age. Although robots are generally trustworthy, they shouldn't be trusted unconditionally. That said, resisting machines altogether seems pointless: they already help us, for example, in stores and airports, and in non-humanoid forms they are already among us."
***
More about the repeated Milgram experiment and similar studies in business, healthcare, and sports will be presented on December 9 and 10, 2023, at the international HumanTech Summit at SWPS University. The event is organized by SWPS University’s HumanTech Center. Online access is free: https://www.htsummit.pl/