NATO’s Strategic Communications Centre of Excellence (StratCom) in Riga has conducted a study on the manipulation of likes, comments and shares, all of which can be purchased for very little money. Its researchers bought likes, comments and other engagement on social media posts to demonstrate how easily readers can be manipulated on social networks.
For just 300 euros, the StratCom Centre bought 3,500 comments, 25,750 likes, 20,000 views and 5,100 followers across various social networks. This alone shows how easily public opinion can be influenced through social media: ordinary users have no way of knowing who is behind the keyboard, or that the engagement on the posts they see is artificial. By analyzing the accounts used for the manipulation, the study’s authors found that 18,739 fake accounts were involved.

The main “manipulators” are Russian providers

The authors investigated how well social networks such as YouTube, Facebook, Twitter and Instagram identify and remove manipulation. StratCom purchased engagement on 105 posts across the four platforms to gauge the scale and methods of manipulation, using 11 Russian and 5 European manipulation service providers; among the European providers were one from Poland, two from Germany and one from Italy. StratCom found that access to this illicit market is extremely easy, while the size of the manipulation industry is “worrying”, the study said.
“We have identified hundreds of providers, dominated by Russian players. Practically all of the major software operators and manipulation infrastructure we have identified are of Russian origin,” the study points out.
It also notes that the mechanisms for automatically removing fake accounts are largely ineffective. Four weeks after the fake engagement was purchased, four in five of the purchased engagements were still active online. This reflects poorly on the social platforms, which have committed to self-regulation, to deleting such accounts and to reporting such cases.
“Additionally, we tested the platforms’ ability to respond when users report fake accounts. Three weeks after the reports were submitted, more than 95 percent of the reported fake accounts were still active online,” the study said.
Self-regulation of social media does not help!

The authors conclude that self-regulation does not work and that a legal framework is needed to prevent online manipulation.

European Commission Vice President Věra Jourová recently issued a statement for Playbook

European Commission Vice President Věra Jourová recently issued a statement for the Playbook newsletter saying that self-regulation is not enough to reduce the impact of fake news. She has proposed that all platforms be opened up so that researchers and governments can examine their algorithms and compare them with the reports the platforms submit on their own work. For now this is only an announcement of a possible legal solution, and no details have been provided on how it would be implemented.

