Lawrence Dodd lives in one of Britain’s most fiercely fought voting districts, and he has been peppered almost daily with ads from the country’s major political parties on Facebook. About a month ago, he tried to find out why.
Mr. Dodd, a maker of musical instruments in northern England, joined an experiment. He and around 10,000 others volunteered their data, allowing researchers to monitor in real time which political ads were showing up in their Facebook news feeds as Britain’s election approached.
Their goal: to shed more light on how political campaigns are using Facebook and other digital services — technologies that are quickly reshaping the democratic process, but which often offer few details about their outsize roles in elections worldwide.
“These political ads aren’t regulated; nobody knows what is being said on Facebook,” said Mr. Dodd, 26, who planned to vote for the Labour Party in Thursday’s election, but who continued to be bombarded with online messages from the Conservatives. “Wherever politics is concerned, there needs to be more transparency.”
Facebook provides little information on how political parties use ads to reach undecided voters on the site. And concern has been growing since the American presidential election about the company’s role in campaigns, including about how politically charged fake news is spread online.
Now, as voters head to the polls across Europe, groups in Britain, Germany and elsewhere are fighting back, creating new ways to track and monitor digital political ads and misinformation on the social network and on other digital services like Twitter and Google.
The political ads shown to Mr. Dodd are being tallied by WhoTargetsMe?, a nonpolitical group that designed a digital tool to monitor Facebook’s role ahead of the British election.
Costing less than $1,000 to build, the technology, which works as a plug-in on desktop web browsers and anonymizes users’ personal information, was created because the social network does not share information on political ads shown to its more than 36 million users in Britain, roughly half the country’s population.
That lack of information has raised hackles about the activities of both Facebook and politicians in a country where campaigns are highly regulated and political financing is tightly capped.
Questions over the social network’s role in politics are particularly raw in Britain, where outside groups were accused of spending lavishly on Facebook during a heated campaign before a referendum on the country’s membership in the European Union. In response, Britain’s privacy watchdog has started an investigation into whether such targeted political advertising breached its strict data protection rules.
“Political advertising is fundamentally different; there’s a lot of concern about what’s being seen on Facebook,” said Sam Jeffers, the group’s co-founder and a former digital media strategist. “The people deserve some sense of what’s going on.”
As the volunteer group is not completely representative of the British population, the data is by no means perfect, highlighting the difficulty of tracking political activity on Facebook.
In the buildup to the election, for instance, the data showed that the Liberal Democrats — who are likely to remain a minority presence in Parliament — posted the largest number of political ads on Facebook. The Conservative Party was second, despite its pledge to spend 1 million pounds, or $1.3 million, on social media messaging. The Labour Party, which planned to spend a similar amount, was in third place.
Initially, all the British parties spent money on broad-brush messages that blanketed the social network without targeting specific voters. But as Election Day approached, that strategy began to change.
An analysis of the data by The Bureau of Investigative Journalism, a nonprofit media organization, showed the country’s major parties were increasingly targeting specific voting districts and wavering voter groups with direct Facebook ads. The number of ads seen by WhoTargetsMe? volunteers has also roughly doubled in the last month, though political messages still represented 2 percent of overall ads displayed in Facebook feeds, according to the group’s analysis.
The ads have included Conservative Party messages about potential nuclear energy jobs in three areas of northern England with ties to the industry, which are among the country’s most contested districts. By contrast, the Labour Party targeted older women nationwide with directed ads about potential threats to their pensions.
“It’s a fundamental conversation to have about how we regulate this,” said Nick Anstead, a media and communications expert at the London School of Economics. “Facebook has a responsibility to tell its users who is buying advertising that is targeting their votes.”
In response, the company says its roughly two billion users worldwide have complete control over which ads they are shown on the network, and that it is the responsibility of individual political parties to comply with their countries’ electoral laws. Facebook adds that its commercial agreements and protection of individuals’ privacy restrict it from sharing more data on how information is distributed on the platform.
“Facebook’s goal is to make it easier for people to get the information they need to vote,” Andy Stone, a company spokesman, said in a statement. “We encourage any and all candidates, groups and voters to use our platform to engage in the elections.”
Facebook and other technology companies have tried to improve what is shared and circulated online, creating partnerships with news outlets to debunk digital falsehoods and cracking down on how fake news websites make money through advertising on social media. The social networking giant also sponsored get-out-the-vote campaigns, and encouraged political groups to create Facebook pages to promote their messages.
Yet during the recent French presidential election, which pitted the centrist candidate Emmanuel Macron against the far-right hopeful Marine Le Pen, several media organizations including Le Monde said they had found it difficult — and overly cumbersome — to report potential fake news items about the candidates to Facebook.
Academics and others scrutinizing the vote also said the company’s failure to provide data on what Facebook users in France shared among themselves made it virtually impossible to determine if false reports spread on the network affected the overall result.
“Facebook’s lack of transparency is a big concern,” said Tommaso Venturini, a researcher at the médialab of Sciences Po, a prestigious university in Paris, who tracked fake news across social media during the French election.
For Ben Scott, such issues bring back mixed memories of the American presidential election, when he was a digital consultant for Hillary Clinton’s campaign.
He has now turned his attention to a project at the New Responsibility Foundation, a Berlin-based research organization that is monitoring the spread and impact of fake news ahead of Germany’s election in September.
He and his team are categorizing potential online misinformation in a digital database, tracking how these false reports spread across social media and the wider web and conducting focus groups to gauge the impact on voters’ decisions.
The role of companies like Facebook in spreading online falsehoods is limited in Germany, Mr. Scott said, because social media does not play as significant a role in everyday politics as it does in the United States.
Still, the social media giant — which has roughly 36 million users in Germany — is a force to be reckoned with in the coming election.
Despite Mr. Scott’s discussions with Facebook about potential collaborations, the company has so far refused to give his research project any data on how local users share potential misinformation among themselves on the social network.
“If we see something getting significant media attention and there’s a sudden spike,” Mr. Scott said, “then we can guess there’s something going on inside Facebook.”