Data Access Initiative Report #1
Finding Models for Transparency and Data Access in the Case of Social Networks
During a one-day workshop on 28 May 2018, a group of experts discussed ways to model a regulatory process for transparency and non-discrimination in the case of social networks. This report sketches the ideas, questions and problems the group formulated.
These ideas are not necessarily positions of the Media Authority Berlin-Brandenburg. Rather, we welcome them as innovative impulses that we would like to discuss and explore further.
Guiding questions of our workshop were:
1. What could a monitoring process to safeguard media pluralism look like?
2. What can we learn from existing monitoring approaches and research?
3. What do we need to keep in mind regarding data protection?
4. How can we obtain a proof of concept?
1. Which types of access to data exist for research so far?
First of all, the group compiled a selection of existing methods for gathering data on the use of social networks. Each form of access provides a different set of information and comes with different challenges:
- APIs, which sometimes expose only very limited publicly available data sets.
Challenge: only public content included.
- Social network apps that collect data within a social network.
Challenge: data protection issues (as in the case of Cambridge Analytica).
- Research councils such as the one Facebook is setting up voluntarily in the US.
Challenge: ensuring independence and avoiding exclusivity of access for privileged research facilities.
- Scraping tools that allow users to scrape their own content and help acquire data sets that would give deeper insight into the usage of social networks.
Challenge: breach of Terms and Conditions of social networks.
- Tools that require user participation (man-in-the-middle tools) like the project ROBIN at the University of Amsterdam.
Challenge: hard to get a representative sample and expensive to realise; monitoring mobile data remains an open question, and apps are a complete black box.
- Third-party data brokers such as ad agencies, which have statistics rather than raw data.
Challenge: only summarised information, expensive.
- Fingerprint method, which would mark certain content and indicate, by means of a technical tool, which users have or have not seen it.
Challenge: potentially not representative and no detailed information about usage.
- Fake accounts, which could simulate the user experience over time in full detail.
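The fingerprint method listed above could, for instance, rest on stable content identifiers plus an exposure log kept on the user side. The following is a minimal, purely illustrative sketch; the class and function names are assumptions, not part of any existing tool:

```python
import hashlib


def fingerprint(content: str) -> str:
    """Derive a stable, pseudonymous identifier for a piece of content."""
    return hashlib.sha256(content.encode("utf-8")).hexdigest()[:16]


class ExposureLog:
    """Records which (pseudonymous) panel users encountered fingerprinted content."""

    def __init__(self):
        self.seen = {}  # fingerprint -> set of user ids

    def record(self, user_id: str, content: str) -> None:
        self.seen.setdefault(fingerprint(content), set()).add(user_id)

    def reach(self, content: str, panel: set) -> float:
        """Share of a user panel that has seen the content."""
        viewers = self.seen.get(fingerprint(content), set())
        return len(viewers & panel) / len(panel)
```

Such a log would only reveal whether marked content reached a panel member, not how it was used, which reflects the representativeness and detail limits noted above.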
2. Proposals to monitor the impact of intermediaries on digital media pluralism
The discussion of the expert group resulted in five concrete ideas on how to monitor intermediaries in general and social networks in particular. Together, the five points add up to a holistic approach to continually and sustainably ensuring the protection of digital media pluralism on social networks.
2.1 Transparency report
To create transparency, information for users is crucial. However, such information will very often simply not be read or even perceived as a disruption. This raises the question: what can be provided to actually increase transparency?
One important step consists of transparency reports. Intermediaries should be obliged to submit an annual report providing insights important for the protection of media pluralism for research and overview purposes. Its goal is to balance the information asymmetry between the knowledge intermediaries have about their users and the knowledge available to the public.
The recipients of this report should be the media authorities which could then predefine the information items provided. Possible items for this report might be:
- How many factors are part of the most relevant algorithms (such as newsfeeds)?
- What factors are there, and how are they weighted?
- What role does journalistic-editorial content play compared to private communication?
- How important are local topics, in particular journalistic ones?
- Are measures taken to ensure the visibility of public-value content?
- Are there agreements with media producers about the ranking of their content?
- How often are the factors and criteria changed per year? Are any major changes planned?
- Who in the company plans and implements these changes?
- What is the process that a change must undergo before it is implemented?
- How many fake accounts have been deleted over the past year?
- How many notifications has the intermediary received regarding fake news? Which cases were reported?
- … and any other items that could be identified and added over time.
Intermediaries should be required to submit an annual report to the media authorities. The media authorities should define the necessary information; the items in this report can change from year to year.
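Since the items in such a report can change from year to year, the media authorities could request it in a machine-readable form and check submissions for completeness automatically. A minimal sketch in Python; the item names below merely mirror the questions listed above and are illustrative, not part of any existing standard:

```python
import json

# Hypothetical required items of an annual transparency report,
# mirroring the example questions above.
REQUIRED_ITEMS = [
    "ranking_factor_count",
    "ranking_factors_and_weights",
    "share_journalistic_vs_private",
    "public_value_measures",
    "factor_change_count",
    "deleted_fake_accounts",
    "fake_news_notifications",
]


def missing_items(report: dict) -> list:
    """Return the required items absent from a submitted report."""
    return [item for item in REQUIRED_ITEMS if item not in report]


# An incomplete sample submission, parsed from JSON.
sample = json.loads('{"ranking_factor_count": 12, "deleted_fake_accounts": 583000}')
```

A media authority could then update REQUIRED_ITEMS annually without changing the submission and verification process itself.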
2.2 Auditing process
An audit would help verify the information of the annual report and provide additional monitoring of intermediaries’ algorithms regarding media pluralism. The media authorities might involve a third party to execute this in-house audit. The audit team could be formed as a consortium, e.g. including independent researchers and trustworthy auditors experienced in similar auditing processes. The media authorities would oversee the process and ensure the independent execution of the audit.
Duties of this external auditor would include inter alia:
- verification of the information provided by the annual report,
- evaluation of the respective social network newsfeed algorithm and other algorithms relevant for the oversight of media pluralism based on data provided confidentially and in-house by social networks,
- gaining insights into research conducted by social networks about their users,
- other issues that are relevant to media pluralism.
Media authorities should have the power to commission an external auditor to execute the monitoring process. Intermediaries should be required to provide all necessary information to the auditor. This does not include the algorithm itself or trade secrets protected by law, but it does include sufficient information on the relevant factors that determine the algorithm’s output.
2.3 Ad transparency
Securing a higher standard of ad transparency is a cornerstone of greater transparency in general. Lately, Facebook has implemented several features to voluntarily provide transparency regarding political advertisements. However, to secure media pluralism and the free formation of political opinion in the long run, additional steps will be necessary. Measures under discussion include the creation of one comprehensive Public Online Advertisement Database (POAD), source transparency (including funding sources) and distinct, immediately visible labelling.
To prevent any buying of political influence, political advertising is prohibited for broadcasters in Germany. As the influence of intermediaries on public opinion formation is almost as important, the law should establish a level playing field. If transparency is considered essential, it must be mandatory and overseen by the media authorities.
2.4 User autonomy
To strengthen user autonomy, the law should require intermediaries to provide users with options for the curation of content. As an obligatory technical feature, the user should be able to choose between modes of information curation. To foster that process, intermediaries should be required, in support of media pluralism, to allow a certain number of external partners to co-curate their newsfeeds. One mode, for instance, could be a chronological newsfeed; another could allow the user to preselect his or her curation preferences; a third could be curation provided by external entities (such as established media institutions).
Additionally, the option of switching between curation modes would allow researchers to come to a better understanding of a social network’s newsfeed algorithm by comparative methods without knowing the actual algorithm.
Social networks should be obliged to provide users with a choice between different modes of curation, make them easy to find and provide information on how these different modes work. Additionally, it would help the process if social networks allowed different media partners in Germany to co-curate their newsfeed algorithms.
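The comparative approach mentioned above can be sketched concretely: switching the same account between a chronological mode and a curated mode yields two orderings of the same items, and their divergence can be quantified without any access to the algorithm itself. A minimal sketch using a simple Kendall-tau rank correlation (no ties handled; all names are illustrative assumptions):

```python
from itertools import combinations


def kendall_tau(order_a: list, order_b: list) -> float:
    """Rank correlation between two orderings of the same items, from -1 to 1.

    1.0 means identical ordering; -1.0 means fully reversed. The further the
    curated feed departs from the chronological one, the lower the value.
    """
    pos_a = {item: i for i, item in enumerate(order_a)}
    pos_b = {item: i for i, item in enumerate(order_b)}
    concordant = discordant = 0
    for x, y in combinations(order_a, 2):
        # Positive product: the pair appears in the same relative order.
        if (pos_a[x] - pos_a[y]) * (pos_b[x] - pos_b[y]) > 0:
            concordant += 1
        else:
            discordant += 1
    return (concordant - discordant) / (concordant + discordant)
```

Tracked over time and across many accounts, such a score could indicate how strongly, and for which kinds of content, the curated mode reorders the feed, without the auditor ever seeing the ranking algorithm.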
2.5 Enforcement
To ensure compliance with these regulatory requirements, sensible enforcement instruments are necessary. If social networks do not comply with binding legal requests or with the due process of the audit, the law must provide appropriate measures to enforce sanctions.
3. Next steps
The expert group suggests starting a process to redefine media pluralism in the digital age. In this process it will, among other things, be necessary to find definitions of “media content” and media pluralism which can be utilised for law and regulation and to discuss measures to secure the visibility of public-value content. It will furthermore be necessary to discuss whether search engines and social networks should be subject to a one-size-fits-all approach to regulation or whether differentiation is required.
Research exemplifying the role of intermediaries case by case is crucial to better understand their role for opinion formation. In addition to singular reports on intermediaries, a more comprehensive understanding of dynamics of media pluralism will be necessary. Civil society stakeholders have proven to be an important driver for this holistic understanding and new approaches of research.
To deepen the knowledge obtained and gather further ideas on how a possible regulation of intermediaries such as social networks could look, the Media Policy Lab is planning to organise additional workshops in the context of the Data Access Initiative.