Facebook’s Role in European Elections Under Scrutiny

Lawrence Dodd lives in one of Britain’s most fiercely contested electoral districts and is bombarded almost daily with ads from the major political parties on Facebook. About a month ago, he decided to find out why.

Mr. Dodd, a musical instrument maker in northern England, joined an experiment. He and some 10,000 others volunteered their data, allowing researchers to track in real time which political ads appeared in their Facebook news feeds as Britain’s general election approached.

Their goal is to highlight how political campaigns use Facebook and other digital services – technologies that are rapidly changing the democratic process but that often offer little detail about their outsize roles in elections around the world.

“These political ads are not regulated; no one knows what’s on Facebook,” said Mr. Dodd, 26, who planned to vote for the Labour Party on Thursday but was nonetheless bombarded with online ads from the Conservatives. “More transparency is needed wherever politics is concerned.”

Facebook says little about how political parties use its ads to target voters. And since the United States presidential election, there has been growing concern about the company’s role in campaigns, including the spread of false news online.

Now, as voters head to the polls across Europe, groups in Britain, Germany and elsewhere are pushing back, creating new ways to monitor and track digital political ads and misinformation on the social network and on other digital services such as Twitter and Google.

The political ads shown to Mr. Dodd are tracked by WhoTargetsMe, a nonpartisan group that has designed a digital tool to monitor Facebook’s role ahead of the British election.

The technology, which works as a plug-in for desktop web browsers and anonymizes users’ personal information, was built for less than $1,000 because the social network provides no information about the political ads shown to its more than 36 million British users, roughly half of the country’s population.
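The article does not describe how the plug-in works internally, but a tool of this kind can be pictured as a browser-extension content script that watches the news feed for posts marked as sponsored and sends out only anonymized ad metadata. The TypeScript sketch below illustrates that idea under stated assumptions: the DOM selectors, the COLLECTION_ENDPOINT URL and the anonymous-identifier scheme are all hypothetical, not details of WhoTargetsMe’s actual tool.

```typescript
// Hypothetical content script for a browser extension that observes a Facebook
// feed, records which ads are shown, and strips personal information before
// reporting. Selectors, endpoint and labels are illustrative only.

const COLLECTION_ENDPOINT = "https://example.org/api/ads"; // hypothetical research endpoint

// A random identifier generated at install time, so researchers can group one
// volunteer's ads without ever learning who the volunteer is.
const anonymousId: string = crypto.randomUUID();

interface ObservedAd {
  volunteer: string;   // anonymous identifier, never the user's name or profile
  advertiser: string;  // the page that paid for the ad
  text: string;        // visible ad copy
  seenAt: string;      // ISO timestamp
}

function extractAds(root: ParentNode): ObservedAd[] {
  const ads: ObservedAd[] = [];
  // Facebook marks paid posts with a "Sponsored" label; the selector here is a
  // stand-in, since the real markup changes frequently.
  root.querySelectorAll<HTMLElement>("[data-ad-label='Sponsored']").forEach((post) => {
    ads.push({
      volunteer: anonymousId,
      advertiser: post.querySelector("a[role='link']")?.textContent?.trim() ?? "unknown",
      text: post.innerText.slice(0, 500),
      seenAt: new Date().toISOString(),
    });
  });
  return ads;
}

function report(ads: ObservedAd[]): void {
  if (ads.length === 0) return;
  // Only ad metadata leaves the browser; nothing about the user's own posts,
  // friends or messages is collected.
  void fetch(COLLECTION_ENDPOINT, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(ads),
  });
}

// Watch the page for newly inserted posts and report any sponsored ones.
const observer = new MutationObserver((mutations) => {
  for (const mutation of mutations) {
    for (const node of mutation.addedNodes) {
      if (node instanceof HTMLElement) report(extractAds(node));
    }
  }
});
observer.observe(document.body, { childList: true, subtree: true });
```

The key design point in such a sketch is that only information about the ads themselves ever leaves the browser, which is what would allow volunteers to share their data without exposing who they are.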

That lack of information has raised concerns about the activities of Facebook and politicians in a country where campaigning is highly regulated and political funding tightly capped.

Questions about the social network’s role in politics are particularly acute in Britain, where campaigns have been accused, among other things, of ramping up spending on Facebook during the campaign before the referendum on the country’s membership in the European Union. In response, Britain’s privacy watchdog has opened an investigation into whether such targeted advertising violated the country’s strict data protection rules.

“Political advertising is fundamentally different, and there is a lot of concern about what is visible on Facebook,” said Sam Jeffers, a co-founder of the group and a former digital media strategist. “People deserve a sense of what’s going on.”

Because the group of volunteers is not representative of the British population, the data is far from perfect, and it highlights the difficulty of monitoring political activity on Facebook.

Early findings show, for example, that the Liberal Democrats, who are likely to remain a minority in Parliament, have posted the largest number of Facebook ads. The Conservative Party was second, despite its pledge to spend £1 million ($1.3 million) on social media advertising. The Labour Party, which planned to spend a similar amount, was third.

Initially, Britain’s parties spent money on messages with broad reach across the social network rather than on ads aimed at particular voters. But as election day approached, their strategies began to change.

An analysis of the data published by the Bureau of Investigative Journalism, a nonprofit news organization, found that the country’s main parties are increasingly focusing on specific constituencies and voter groups with direct Facebook ads. The number of ads seen by WhoTargetsMe volunteers has also roughly doubled over the past month, although political messages still accounted for only 2 percent of all the ads displayed in their Facebook feeds, according to the group’s analysis.

The ads included Conservative Party messages about potential jobs in the nuclear industry, aimed at three areas of northern England with industrial ties that are among the most closely contested constituencies in the country. The Labour Party, by contrast, targeted older women nationwide with ads about potential threats to their pensions.

“It’s a crucial conversation about how we regulate this,” said Nick Anstead, a media and communications expert at the London School of Economics. “Facebook has a responsibility to tell its users who is buying the ads that target their votes.”

In response, the company says that its roughly two billion users worldwide have control over which ads appear on the network, and that it is the responsibility of individual political parties to comply with their countries’ electoral laws. Facebook adds that its commercial considerations and users’ privacy limit how much further information it can share about how content is distributed on the platform.

“Facebook’s goal is to make it easier for people to get the information they need to vote,” said Andy Stone, a spokesman for the company. “We encourage any and all candidates, groups and voters to use our platform to engage in the elections.”

Facebook and other technology companies have tried to improve what is shared and spread online, forming partnerships with news organizations to flag fake stories and cracking down on how fake news websites make money through social media advertising. The social networking giant has also sponsored get-out-the-vote campaigns and encouraged political groups to create Facebook pages to promote their messages.

Yet during the recent French presidential election, which pitted the centrist candidate Emmanuel Macron against the far-right hopeful Marine Le Pen, several media organizations, including Le Monde, said it had been difficult – and often too cumbersome – to flag potentially false reports about the candidates on Facebook.

Academics and others who monitored the election also said that because the company did not provide data on what Facebook users in France were sharing, it was virtually impossible to determine whether false reports spreading across the network had affected the overall result.

“The lack of Facebook transparency is a big problem,” said Tommaso Venturini, a research scientist at Sciences Po in Paris who tracked fake news on social media during the French election.

For Ben Scott, such questions bring back mixed memories of the United States presidential election, when he served as a digital consultant to Hillary Clinton’s campaign.

He is now focused on a project at the New Responsibility Foundation, a Berlin-based research organization, that monitors the spread and impact of fake reports ahead of the German elections in September.

He and his team categorize potential online disinformation in a digital database, track how these fake reports spread across social media and the broader web, and run focus groups to measure the impact on voters’ choices.

The role of companies such as Facebook in spreading misinformation online is more limited in Germany, Mr. Scott said, because social media does not play as central a role in day-to-day politics there as it does in the United States.

Still, the social media giant, which has around 36 million users in Germany, remains a force to be reckoned with in the coming elections.

Despite Mr. Scott’s discussions with Facebook about a potential collaboration, the company has so far declined to provide the research project with any data on how local users share potential misinformation on the social network.

“If we see something that gains significant media attention and suddenly spikes,” Mr. Scott said, “then we can guess that something is happening on Facebook.”
