
The development of technology and the increasing importance of social media are transforming the way we consume information. The World Economic Forum’s "Global Risks Report 2025" indicates that as many as 5.5 billion people use the internet, with over 5 billion active on social media. In this digital environment, algorithms play an increasing role in selecting content for users.
Recommendation systems increasingly reinforce social divisions. Studies show that the more extreme a piece of content is, the more likely it is to be shared. This creates information bubbles and limits exposure to alternative viewpoints.
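To make the mechanism concrete, the sketch below simulates a hypothetical engagement-weighted feed: posts with more charged framing receive a higher predicted engagement score and are ranked first. The scoring formula, weights, and example posts are illustrative assumptions for this article, not values taken from the report.

```python
# Minimal sketch of an engagement-weighted feed ranking (illustrative only).
# The weights and example posts are assumptions, not data from the report.
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    extremity: float      # 0.0 = neutral framing, 1.0 = highly charged
    base_interest: float  # topical relevance to the user

def predicted_engagement(post: Post, extremity_weight: float = 2.0) -> float:
    """Toy engagement model: charged content gets a disproportionate boost."""
    return post.base_interest + extremity_weight * post.extremity

def rank_feed(posts: list[Post]) -> list[Post]:
    """Order the feed purely by predicted engagement, as an engagement-driven platform might."""
    return sorted(posts, key=predicted_engagement, reverse=True)

feed = rank_feed([
    Post("Balanced analysis of the new policy", extremity=0.1, base_interest=0.8),
    Post("Outrage: 'they' are destroying everything!", extremity=0.9, base_interest=0.5),
    Post("Routine local news update", extremity=0.0, base_interest=0.6),
])

for post in feed:
    print(f"{predicted_engagement(post):.2f}  {post.text}")
```

Even a modest extra weight on charged framing pushes the most extreme post to the top of the feed; repeated across sessions, a user mostly sees content that confirms and intensifies existing views, which is the bubble effect described above.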
A particularly dangerous trend is the automation of disinformation. Generative AI enables the creation of fake images, videos, and voice recordings, making it easier to manipulate public opinion.
| Threat | Consequences |
| --- | --- |
| Algorithms reinforcing divisions | Polarization of society |
| Information bubbles | Limited access to diverse sources |
| Automation of disinformation | Increase in manipulated content |
Rising Threats to Information
Disinformation and information manipulation have been identified as the top global threats in both the short and long term. Experts emphasize that the development of generative artificial intelligence (GenAI) has dramatically reduced the cost and time required to produce content.
As noted by the authors of the Global Risks Report 2025, this has resulted in a surge of manipulated materials – from fake videos and doctored voice recordings to AI-generated texts. A particularly troubling issue is that social media algorithms often favor controversial or shocking content, further exacerbating disinformation.
Media in an Era of Low Trust
Global trust in the media is at its lowest in years. In a survey covering 47 countries, only 40% of respondents stated that they trust most news sources. This is a result not only of disinformation but also of growing social and political divisions that influence how facts are perceived. In high-income countries, concerns about the effects of disinformation are more pronounced than in developing nations. In 13 countries, including Germany, India, and Canada, disinformation has been ranked among the top five threats.
Meanwhile, technology is further eroding traditional mechanisms of information verification. Artificial intelligence can generate hyper-realistic content that is difficult to distinguish from authentic materials. The result is information chaos, undermining the foundations of public debate.
| Problem | Consequences |
| --- | --- |
| Spread of disinformation | Undermining trust in credible media |
| Algorithms promoting controversy | Polarization of public opinion |
| Generative AI in the hands of disinformers | Increase in manipulated content |
Beyond false information, censorship is also becoming a problem, whether imposed by governments or exercised through the content decisions of technology corporations.
Censorship and Digital Surveillance
Governments and corporations today have greater access to user data than ever before. The digitization of administration allows authorities to amass vast amounts of information about citizens – from tax records to voter registries. The report highlights that, in many cases, private companies have better insight into user data than governments themselves.
Digital surveillance and media repression are particularly pronounced in East Asia, Latin America, and the Caribbean. In these regions, censorship and citizen monitoring are rising rapidly in the rankings of global threats. In Nicaragua, for example, digital surveillance has already become the country’s fourth-largest risk.
At the same time, technology companies are exerting increasing influence over public debate. On the one hand, social media platforms remove content deemed dangerous; on the other, they often block information arbitrarily, prompting accusations of bias and violations of free speech.
Fake News as a Political Weapon
Disinformation has become a tool of information warfare between nations. Election interference, economic destabilization, and the manipulation of public opinion can be carried out by governments and organizations alike.
The biggest threats associated with fake news include:
- Creating false political narratives – for example, spreading manipulated information before elections.
- Fueling social tensions – reinforcing conflicts between different social groups.
- Reputation attacks – campaigns aimed at discrediting public figures.
The World Economic Forum suggests that combating disinformation requires better labeling of AI-generated content and greater transparency regarding the algorithms used by social media platforms.
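As a rough illustration of what labeling AI-generated content could look like, the sketch below attaches a simple machine-readable provenance record to a generated text. The `label_generated_content` helper and its field names are hypothetical, invented for this example rather than taken from the report or from any existing standard.

```python
# Hypothetical provenance label for AI-generated content (illustrative sketch;
# field names and structure are assumptions for this example, not a real standard).
import hashlib
import json
from datetime import datetime, timezone

def label_generated_content(content: str, generator: str, publisher: str) -> dict:
    """Bundle content with a machine-readable 'AI-generated' provenance record."""
    return {
        "content": content,
        "provenance": {
            "ai_generated": True,
            "generator": generator,
            "publisher": publisher,
            "created_at": datetime.now(timezone.utc).isoformat(),
            # A hash lets platforms detect later edits to the labeled content.
            "content_sha256": hashlib.sha256(content.encode("utf-8")).hexdigest(),
        },
    }

labeled = label_generated_content(
    content="Synthetic summary of today's market news.",
    generator="example-llm-1",
    publisher="example-newsroom",
)
print(json.dumps(labeled, indent=2))
```

In practice, such a label would have to be cryptographically signed and verified by the platform displaying the content; otherwise it could simply be stripped or forged.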
Education as the Key to Fighting Disinformation
Experts indicate that one of the most effective ways to counter disinformation is digital education. Citizens should have a better understanding of how social media works, how information manipulation occurs, and how to protect their data.
The most important areas of education include:
- Identifying fake news – learning critical thinking and source verification.
- Awareness of algorithms – understanding how social media platforms shape information.
- Privacy protection – knowledge of security settings and encryption methods.
Without improving public awareness, the digital information ecosystem may become even more chaotic, making disinformation increasingly difficult to detect.
The full Global Risks Report 2025 is available at:
https://www.marshmclennan.com/insights/publications/2025/january/global-risks-report.html