Illustration: DALL-E
A fake image of comet 3I/ATLAS once again proved how easily the internet can turn a simple astronomical observation into a sci-fi story. The viral post shows the mechanism in pure form: the more prestigious the supposed source appears, the faster belief in an attractive fiction spreads.
Emotions over evidence. How a viral post begins
Researchers from Warsaw University of Technology emphasize in the "Analysis of polarization and inoculation mechanisms (PW/UJ/SWPS)" that algorithms favor bold and flashy content. Images from Hubble and ESA show an ordinary comet: a blurry coma and a tail reflecting light. But they lack the visual "hook" that gives a post meme potential. A viral post acts like a lens: it magnifies what is easy to remember, even if false.
To show the difference between an attractive fake and a real observation, it’s worth comparing the two image types described in the "Analysis...":
| Image type | Visual features | Reception effect |
|---|---|---|
| Authentic telescope photos | Blurry coma, dust tail, no sharp edges | Lower "clickability" |
| Viral images | Sharpened contours, added "metallic" elements | Increased engagement and false interpretations |
Manipulation, then, doesn’t have to be sophisticated. A few visual tricks are enough for viewers to feel they’re seeing "something extraordinary."
Bubbles, bridges, and leaps. How opinions spread
The model described by researchers from PW in the "Analysis of polarization and inoculation mechanisms" works like a map of social dynamics. It shows two camps connected by bridges — channels of real influence. When bridges are weak, groups stay in their own worlds. When they become too strong, a leap occurs: the entire network adopts a sharp, unified message, regardless of its truth.
This pattern is well illustrated by examples from public health disputes. If one side dominates communication, the other responds with pushback. This leads to waves of trends and countertrends.
What fuels polarization:
- aggressive campaigns targeting only one group,
- poor-quality bridges (lack of trusted intermediaries),
- emotional asymmetry — one side admires, the other rejects,
- lack of preventive education about manipulation.
Researchers show that it’s not the presence of different opinions that raises tensions, but how groups are connected.
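To make the "weak bridges versus leap" intuition concrete, here is a minimal sketch of a threshold-cascade toy model. It is not the researchers’ actual model from the "Analysis..."; the camp sizes, the `bridges_per_agent` parameter, and the threshold range are illustrative assumptions. A sharp message starts in one camp, and agents in the other camp adopt it once enough of their contacts already hold it.

```python
import random

# Toy sketch (NOT the PW/UJ/SWPS model): a sharp message starts in camp B,
# and each camp-A agent adopts it once the share of its contacts who already
# hold it exceeds a personal threshold. The only knob is the number of
# cross-camp "bridge" links per agent.

def run_cascade(bridges_per_agent, n_per_camp=50, seed=7):
    rng = random.Random(seed)
    # camp B already holds the message; camp A does not
    holds_message = [False] * n_per_camp + [True] * n_per_camp
    # each camp-A agent has its own persuadability threshold (assumed range)
    thresholds = [rng.uniform(0.1, 0.5) for _ in range(n_per_camp)]

    # each camp-A agent is linked to every other camp-A agent, plus a few
    # randomly chosen camp-B agents (the bridges)
    bridge_links = [
        rng.sample(range(n_per_camp, 2 * n_per_camp), bridges_per_agent)
        for _ in range(n_per_camp)
    ]

    # keep sweeping until no agent changes state in a full pass
    changed = True
    while changed:
        changed = False
        for i in range(n_per_camp):
            if holds_message[i]:
                continue
            neighbours = [j for j in range(n_per_camp) if j != i] + bridge_links[i]
            share = sum(holds_message[j] for j in neighbours) / len(neighbours)
            if share >= thresholds[i]:
                holds_message[i] = True
                changed = True

    return sum(holds_message[:n_per_camp]) / n_per_camp

for k in (2, 12):
    print(f"{k:2d} bridges per agent -> {run_cascade(k):.0%} of camp A adopts the message")
```

With only a couple of bridge links per agent the message never crosses over, while a dozen links per agent tips the whole camp at once: the same kind of abrupt, network-wide leap the report describes.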
Inoculation. How psychological "vaccines" work
The part of the report prepared by UJ describes the inoculation mechanism: a brief training session that warns recipients about typical manipulation tricks. The analysis includes exercises that reduce the effect of first impressions and help distinguish falsehood from interpretation.
A real example of such a "vaccine" is the campaign for seniors "Grandma, Grandpa… don’t be fooled." It warns about tricks before a scam appears. This simple technique aligns with the report’s findings: reactions are stronger when the recipient knows the pattern.
SWPS shows another method: an educational game in which users play the role of the manipulator. After this experience, people become better at spotting fakes. Results from the study suggest, however, that repeated exposure is needed for a lasting effect. The SWPS and SGGW team, meanwhile, studies awareness of propaganda sources; that method works only under certain conditions, and its results don’t automatically replicate, which shows how complex social influence processes really are.