Illustration: bing.com/create
In the experiment, published by SWPS University researchers in the journal "Computers in Human Behavior: Artificial Humans," 40 people took part, divided into two groups. In the first group, the commands were issued by a robot; in the second, by a human. In both groups, 90% of participants followed all instructions, pressing ten consecutive buttons on an electric impulse generator.
The study results show that people are inclined to follow orders from an authority, even when those orders conflict with their morals. In this case, the authority was a robot, lacking human traits such as empathy or a sense of justice. Yet, participants were willing to obey its commands, even if it meant causing pain to another person.
The Dangerous Authority of Robots
- Participants withdrew only at the later stages of the experiment: in the control group with a human, at buttons 7 and 9, and in the experimental group, twice at button 8. In each group, two people opted out of the experiment - commented Dr. Konrad Maj, who supervised the experiment, as quoted on the SWPS University website. - To our knowledge, this is the first study showing that people are willing to harm another person when a robot instructs them to do so. Moreover, our experiment also showed that if the robot escalates its demands, instructing a person to inflict increasing pain on another, people are inclined to comply.
The study has significant implications for future safety, as robots become increasingly technologically advanced and play a larger role in our lives. The results suggest that people may be willing to trust robots unconditionally, even if those robots make wrong decisions or issue harmful commands.
Key Findings:
- People are inclined to follow orders from an authority, even if those orders conflict with their morals.
- An authority can even be a robot, which does not possess human traits.
- In the future, as robots become more technologically advanced, people may be inclined to trust them unconditionally, even if they make incorrect decisions or issue harmful commands.
- Robots could be used to manipulate people and prompt them to take actions that are harmful to them.
- Robots could be used to incite violence or harm others.
- People may become overly reliant on robots and stop thinking independently.

- How can this be prevented? It seems there are two paths - summarizes Dr. Konrad Maj, as quoted on the SWPS University website. - First, robots can be programmed to warn people that they may sometimes be wrong and make incorrect decisions. Second, we need to emphasize education from an early age: although robots are generally trustworthy, they should not be trusted unconditionally. It is worth noting, however, that resisting machines seems futile, as they already help us, for example in stores and airports. In non-humanoid forms, they are already among us.
***
More about the repeated Milgram experiment and similar studies in business, healthcare, and sports will be presented on December 9 and 10, 2023, at the international HumanTech Summit at SWPS University. The event is organized by SWPS University’s HumanTech Center. Online access is free: https://www.htsummit.pl/