
TikTok ready for elections. "Politicians have content promotion and advertising features turned off" [INTERVIEW]


Grzegorz Kubera, Business Insider Polska: Presidential elections are approaching in Poland. Could citizens using TikTok encounter false information on the platform during that time?

Łukasz Gabler, Public Policy and Government Relations Manager CEE, TikTok: TikTok is primarily an entertainment platform, not an information platform, but the presence of politicians and election-related topics is noticeable on it. Our community actively takes part in discussions about voting and election processes, so we recognize that these are issues that matter to users and generate discussion.

For this reason, before each election we launch measures aimed at securing the election process on our platform. This is a standard procedure we implement globally; so far we have covered more than 200 election processes around the world. In Europe alone this year we have carried out such measures around at least four election campaigns.

And what are these measures?

They focus on three key areas. The first is content moderation and safety: here we rely both on a team of moderators and on technology that automatically detects and removes material that violates our rules, including disinformation.

The second area is cooperation with experts. We work with fact-checking organizations as well as with institutional partners, such as government agencies and institutes that monitor online content, which inform us about potential violations, warn us about emerging phenomena and advise us on how to deal with them.

The third area, which we treat as no less of a priority, is user education. Most reports and analyses on disinformation indicate that the public needs to be educated in critically analyzing content. That is why we launch information campaigns in which we explain, for example, how the election process works and direct users to trusted sources, such as the website of the National Electoral Commission. We also run educational activities with local partners; in Poland this year they include Demagog, the Eurozet Group and the "Orient" project. Together we produce educational videos that help users recognize disinformation and distinguish reliable material from false material.

Do these measures detect and remove false information quickly enough? It shouldn't be the case that a piece of content is displayed hundreds of thousands of times before it is recognized as false and flagged for removal.

When it comes to the effectiveness of fact-checking, we are aware that response time is crucial, especially just before an election. That is why we choose partners that can operate efficiently on large platforms and understand the specifics of TikTok. We sign contracts only with fact-checking organizations certified by the IFCN (International Fact-Checking Network), which guarantees the quality and independence of their work.

We are also supported by experts from external safety advisory councils, which gives us additional flexibility and the ability to respond quickly to reports and notifications appearing on the platform.

Let's get into specifics. Does detecting false content take, for example, two hours from the moment of publication?

We recently published a report on our efforts to remove content that violates the integrity of the election process, including disinformation. As much as 99 percent of such material is removed proactively, before anyone even reports it, so we are talking about effective prevention. In situations where the moderation system does not catch the content at the initial stage, nearly 90 percent of that material is removed within 24 hours. That is a very high rate of effective response in a short time.

What about ads? Is there fact-checking here too? I read that last year TikTok had a problem in Ireland, where it allowed election-related ads to run that, on top of that, contained false information.

TikTok does not allow political advertising, which means these functions are not available on the platform to politicians or other public figures. Such activities are simply not carried out on our platform. Advertising material is moderated in a separate, strictly regulated process, independent of the moderation of organic content. The teams responsible for advertising operate not only on the basis of our Community Guidelines but also according to internal advertising policies that precisely define what content may be promoted as part of a paid campaign.

In other words: such ads should never make it onto TikTok, and at least from a technical point of view it is impossible to post political ads on TikTok?

Correct. You cannot post or buy political advertising on TikTok. This also translates into a special policy for accounts belonging to politicians, parties and other public figures. Such accounts have the functions for promoting content, entering into collaborations or buying ads turned off; those options are simply unavailable to them.

TikTok allows users to express opinions and exercise freedom of speech. This raises challenges

Let's move on to videos that express a user's opinion. Personally, I often come across videos on TikTok that are simply untrue. I'll give an example: I was looking for tips on Google search positioning (SEO). I have been working in this field for more than 10 years, I have years of experience and I stay up to date. I easily found advice with hundreds of thousands of views that was simply made up out of thin air. I have heard similar opinions from specialists in other industries; they too easily come across content produced by people who are not practitioners, yet claim expertise and spread false information. Is it the proverbial free-for-all?

Our Community Guidelines clearly define what content is and is not allowed, and they apply to all users. Material published on the platform goes through several stages of verification. It is checked already at the upload stage, and later also reactively, especially when interest in a given video grows or when we receive reports from users or from the partner institutions we work with.

However, verifying content can be difficult, especially when we are dealing with opinions rather than hard facts. That is why cooperation with fact-checking organizations, which support us in deciding whether to block material, remains an important part of our work. It must be honestly admitted, though, that there are situations in which even experts are unable to clearly assess the truthfulness of the information presented, for example because reliable sources are lacking or the topic is too recent for the public to have formed a clear position.

And what happens then?

In such cases, we use additional mechanisms to limit the reach of the material. Videos we have doubts about are not eligible for the "For You" feed, which is the main recommendation channel on TikTok. We also give them special labels informing viewers that the content has been moderated but that its credibility could not be clearly assessed.

Users who want to share such material further receive a warning to be cautious. In this way, we try not only to limit the spread of potentially misleading content, but also to encourage the community to think critically and carefully assess what they are watching.

As for fact-checking, am I right that you do not check the substance of content on topics like SEO, journalism, creating graphics and animations, and many other areas? Or am I wrong?

Fact-checking organizations are able to verify content to varying extents; it is not the case that they work only within very narrow specializations or rigid contracts. Their work is flexible and adapted to the nature of the material. It is worth remembering, however, that not all of the content we are talking about can be clearly classified as false, harmful or unfounded.

For many users it also has a certain value: it can be a source of information and knowledge, but also of inspiration and even entertainment. How such content is judged depends largely on its context and on how it is received by different groups of viewers.

TikTok in Poland is safe. Regulations protect against sending data to China

In recent months, TikTok has often been criticized by the media, especially abroad. Because the owner is a Chinese company, there are fears that data flows to China and that in this way the Communist Party and organizations associated with it "learn about" people from other countries, especially from the US. Quite recently, the Data Protection Commission also imposed a fine of 530 million euros for insufficient data protection and for transferring data to China. For many people, TikTok has a rather negative reputation. What is your position on this?

As TikTok's European branch, we usually do not comment on events or decisions taking place in the United States, if only because we operate in two different regulatory systems. In Europe, the functioning of platforms like ours is strictly defined by EU regulations, which ensures stability and predictability in how we run the business. That of course does not mean we approach these issues any less rigorously in the US.

From a global perspective, however, it is worth noting that more and more people recognize the international character of TikTok, also in the context of the ongoing debate in the US. Our company was created with the involvement of investors from various parts of the world, including the US, which we have been emphasizing for years when pointing to our multinational structure and way of operating.

When it comes to data protection, since 2023 a fully functioning system for securing TikTok users' data has been in place in Europe, which we call Project Clover. It is a pioneering approach in the industry and will cost a total of EUR 12 billion. It is based on three main elements. First, the data of European users is stored locally, in data centers located in Ireland and Norway, which limits its transfer outside Europe.

Second, we have implemented internal data protection policies, which we enforce with the utmost care. Third, the whole system is overseen by an independent cybersecurity company, NCC Group, which not only monitors whether we meet our obligations but also creates additional layers of security by controlling the flow of data within our organization. This model guarantees that user data is protected in line with the commitments we make to users, and it is an innovative approach to privacy in the digital platform sector.

So TikTok in Europe is audited by an external company?

Definitely yes.

And we shouldn't worry that our data ends up in China?

Of course, we are also subject to oversight by government institutions and EU bodies, but what distinguishes this model is the role of our partner, NCC. It monitors the flow of data within the platform and, importantly, works independently of us and can publish reports on its findings. If irregularities are detected, NCC can also contact the relevant government institutions directly, informing them of the issues observed.

And if a TikTok user sees content that in their opinion is clearly misleading, whether in the context of elections or on any other subject, do they have tools to report it to the moderators?

Yes, the content reporting system is available on TikTok and works intuitively. From the menu, a user can select a dedicated option to report material related to the election process, for example material that violates electoral law or contains disinformation.

In addition, in the election center we launched on TikTok, we direct users to the "Safe Elections" page, created in cooperation with NASK and other platforms operating in Poland. This page contains not only educational materials on recognizing disinformation and the threats associated with it, but also offers a dedicated mechanism for reporting false election-related content. Reports also reach us directly, which allows for a faster response and more effective action.

Interview by Grzegorz Kubera
