Written by Shelley Hepworth, Section Editor: Technology, The Conversation

A group of 20 academics based at universities around the world has written an open letter to Facebook, calling on the company to rethink how it engages with the research community.

In the wake of recent controversies over privacy, Facebook announced new restrictions on third-party access to public user data via its application programming interfaces (APIs).

At face value, the move seems like a step in the right direction for users. However, it has raised concerns among academics because of its implications for research conducted through the platform.
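
For context only: the kind of third-party access being restricted typically works through Facebook's Graph API, where an app presents an access token and requests data over HTTP. The sketch below is purely illustrative and is not drawn from the letter or Facebook's announcement; the page ID, access token, and API version are placeholder assumptions showing the general shape of such a request, not a currently sanctioned endpoint.

```python
# Illustrative sketch only: the general shape of a third-party request
# to Facebook's Graph API for a public Page's posts. The page ID,
# access token, and API version are placeholders, not real values.
import requests

GRAPH_URL = "https://graph.facebook.com/v2.12"  # version is an assumption
PAGE_ID = "examplepage"                         # hypothetical public Page
ACCESS_TOKEN = "YOUR_APP_ACCESS_TOKEN"          # placeholder token

response = requests.get(
    f"{GRAPH_URL}/{PAGE_ID}/posts",
    params={
        "fields": "message,created_time",  # request only the fields needed
        "access_token": ACCESS_TOKEN,
    },
)
response.raise_for_status()

# Print a short summary of each returned post.
for post in response.json().get("data", []):
    print(post.get("created_time"), post.get("message", "")[:80])
```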

Read more: Online privacy must improve after the Facebook data uproar

“Clearly the aim of this is to try and protect user data, and that’s perfectly appropriate,” said Axel Bruns, co-author of the letter and Chief Investigator at Queensland University of Technology’s Digital Media Research Centre.

“But at the same time it also significantly reduces the opportunity for independent scrutiny of what happens on Facebook. It actually positions Facebook even more as the gatekeeper of what kind of research can be done on the platform.”

Facebook supports research on election impact

As part of the changes, Facebook announced it will support a research initiative to shed light on social media’s impact on elections.

The research will be independent of Facebook and funded by a number of philanthropic foundations, including the Laura and John Arnold Foundation, Democracy Fund, the William and Flora Hewlett Foundation, the John S. and James L. Knight Foundation, the Charles Koch Foundation, the Omidyar Network, and the Alfred P. Sloan Foundation.

Facebook views the initiative as a model for partnerships between industry and academia, stating in a blog post that:

In consultation with the foundations funding the initiative, Facebook will invite respected academic experts to form a commission which will then develop a research agenda about the impact of social media on society — starting with elections.

Separate rules for academics

But while they welcome the access, Bruns and his co-authors say the initiative is too limited. They are concerned the panel will be skewed towards well-known researchers based in the US, overlooking innovative research on little-known topics by researchers who are not yet established.

“If you think of fake news and foreign influence operations, a few years back no one would have thought that this was a major issue – because Brexit and Trump hadn’t happened,” said Bruns.

“There were already people in the field trying to investigate this on these platforms, but Facebook or Twitter would not necessarily have run any data access grants built around these sorts of topics.”

Read more: We need to talk about the data we give freely of ourselves online and why it's useful

Currently, all third parties are subject to the same restrictions on access to Facebook user data. The letter’s authors are calling on Facebook to develop specific rules that grant wider access to academics.

They note that academics are required to obtain ethical clearance for their work, which is not the case for commercial app developers or market research companies.

What kind of research are we talking about?

Following Facebook’s announcement, Danish researcher Anja Bechmann put out an informal call to compile a list of published research that would not have existed without access to data from various social media platforms.

To date, researchers have added more than 120 studies, spanning topics from how protests are organised on social media to the way misinformation spreads online and the role Facebook plays in the transition to parenthood.

This is the kind of research academics are concerned could suffer due to data access restrictions.

Should users be worried about this kind of access?

While few would argue against more scrutiny of how social platforms operate, some privacy experts believe the protection of user privacy should be prioritised above this kind of research.

David Vaile, chair of the Australian Privacy Foundation, points out that an academic was involved in the data breach at the heart of the Cambridge Analytica scandal – although he was apparently working in a personal capacity at the time.

Read more: How Cambridge Analytica’s Facebook targeting model really worked – according to the person who built it

Another group of researchers conducted a research project on Facebook about emotional contagion, which reportedly targeted people online without consent and in the absence of university-standard ethical oversight.

Vaile is concerned that the research community itself hasn’t sufficiently reckoned with what went wrong in those instances, including conflicts of interest, blindness to potential risks for subjects and other ethical considerations.

“I’d be very cautious about saying we know enough about how they’ve worked in the past to know whether we’ve learned all our lessons,” Vaile said.

Vaile cites IT security authority Bruce Schneier’s warning that sensitive data is not “the new oil” but a “toxic asset”, and questions whether supposedly de-identified personal data can ever be properly protected. He says the big data and AI tools that can infer identity from anonymous data are becoming more sophisticated all the time, while other data “in the wild” continues to proliferate.
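
To illustrate the kind of inference risk Vaile describes, here is a minimal sketch, using fabricated example records rather than any real dataset, of how a “de-identified” dataset can be re-identified by linking it to a public one on shared quasi-identifiers such as postcode, birth year and gender.

```python
# Illustrative sketch with fabricated example records (not real data):
# re-identification by linking a "de-identified" dataset to a public one
# on shared quasi-identifiers (postcode, birth year, gender).
import pandas as pd

# "De-identified" health records: names removed, quasi-identifiers kept.
deidentified = pd.DataFrame([
    {"postcode": "4000", "birth_year": 1985, "gender": "F", "diagnosis": "asthma"},
    {"postcode": "2010", "birth_year": 1972, "gender": "M", "diagnosis": "diabetes"},
])

# A public dataset (for example, an electoral roll) containing names and
# the same quasi-identifiers.
public = pd.DataFrame([
    {"name": "Jane Citizen", "postcode": "4000", "birth_year": 1985, "gender": "F"},
    {"name": "John Example", "postcode": "2010", "birth_year": 1972, "gender": "M"},
])

# Joining on the quasi-identifiers re-attaches names to "anonymous" records.
reidentified = deidentified.merge(public, on=["postcode", "birth_year", "gender"])
print(reidentified[["name", "diagnosis"]])
```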

Read more: Facebook data: why ethical reviews matter in academic research

“Everyone wants trust, nobody wants to ask what being trustworthy means,” Vaile said.

“To me, it means you’ve done the hard work, the global vulnerability research, the technical probabilities and statistics to frankly take part in the discussion that might have to say: actually we might not be able to protect this.”

In the absence of any guarantee that data can be permanently and reliably protected, Vaile advocates “doing more with less”. That means collecting less and retaining less, and using those new tools not to “collect it all” but to make the most of the minimum possible.

“A more up-to-date, future-looking solution would be to say: let’s work with what we’ve got already, and try to really understand that,” he said.

Recent events have spurred a new appreciation for the impact social media can have on society. There are increasing calls to regulate Facebook, even as it becomes apparent that many lawmakers don’t fully understand much of the technology that underpins it.

Decision-makers must now work out where the line should be drawn between protections for user data and research that could make companies such as Facebook more accountable.

Read more: http://theconversation.com/academics-call-on-facebook-to-make-data-more-widely-available-for-research-95365