Illustration: Bing AI

She reports that as many as 25 percent of girls in the 18-24 age group follow virtual influencers. In the 55+ group, on the other hand, most people do not even know that this phenomenon exists.
Most virtual influencers are created for marketing; their purpose is to earn money. However, we must not forget their other, more or less hidden goals, such as political or social ones. After all, they shape opinions, as the very name "influencer" suggests.
The first virtual influencer, Lil Miquela, was created in 2016. She - and other such characters - "run" social media profiles, "record" albums available on popular platforms, and "take part" in advertisements for real brands or in promotions of places. Whole teams of people are behind these activities - from programmers and graphic designers to storytellers.
"Thanks to the development of generative artificial intelligence, we increasingly interact with characters whose ontological status is unclear. For many people, virtual characters are alive despite the lack of a biological body: they exist in some world, they are active, they earn money, they interact, they can answer questions. This creates an emotional bond that is enough to influence a person," the researcher says.
By design, virtual influencers are meant to be imperfect, like people. "They have all the physical attributes that human influencers try to avoid: freckles, smudged makeup, hair sticking out in the morning, sweat on the forehead. What is more, they do not hide that they are digital characters. As a result, Gen Alpha perceives them as authentic - even more so than real influencers, who often use various tools to improve their appearance and strive for the ideal," she says.
She adds that this is where a deeper problem begins. "Authenticity has become detached from whether something is real or not; it turns out it can be attributed to something that does not exist. We are left with the division between fiction and reality, because the offline-online division collapsed long ago. Now the division between the living and the non-living is also beginning to blur. That is why future generations will certainly face the problem of distinguishing truth from falsehood, reality from illusion," she says.
The problem will not disappear; on the contrary, the development of technology and artificial intelligence is inevitable. Moreover, starting in 2025, the next generation - Generation Beta - will be born, even more immersed in the digital world and, in a sense, addicted to the dose of emotions that comes with it.
According to Dr. Pawlak, one of the advantages of virtual influencers is that they accept their audience as they are. "In the thumbs-up (like) culture in which we live, the possibility of not being judged seems like a dream. The creators of artificial intelligence know this very well, and that is why they create virtual influencers," she says.
She adds that contact with virtual influencers raises questions: how does the development of technology affect intimacy? Will new forms of parasocial relationships (with chatbots, virtual influencers) be the answer to loneliness and the need for closeness?
Dr. Pawlak also mentions a fairly new, developing trend in which virtual influencers are used to "pull" people out of the online world, e.g. by connecting users who then go out for dinner together, or by promoting real places worth visiting.
"I compare the achievements of artificial intelligence to fast food. If I eat a hamburger every once in a while, I know nothing will happen to me. But if I make it a habit, then after a short while my psychological and physical well-being will deteriorate. There are already scientific studies showing that lonely people who spend too much time with a synthetic companion at some point begin to anthropomorphize this character and genuinely treat it as a human being," she says.
She adds that technology itself is neither bad nor good, nor even neutral.
"It all depends on the usage - on how we use these tools, as well as on our entire emotional and social background, our protective digital suit," she says.
source: PAP - Science in Poland




























