
#digidemos 2017: Congress for Digitisation and Democracy

At last year's #digidemos, a congress on digitisation and democracy held in Berlin on 20 June, the focus was on aspects of democracy, the public sphere and work within a digitised society. There were also discussions on new forms of societal communication and participation, with over 50 speakers at nine parallel expert forums and a total of 400 people in attendance.

© Gerngross Glowinski Fotografen

One agenda item of the event, organised by the Friedrich-Ebert-Stiftung, was the panel “Algorithm-based public sphere and values”, featuring Nele Heise (research assistant at the Institute of Journalism and Communication Science at Universität Hamburg), Dr. Anja Zimmer (Director of the media authority Berlin-Brandenburg) and Peter Welcherling (journalist). The panel was moderated by the freelance journalist, author and editor Christine Watty.

The discussion dealt with the algorithm-based public sphere, in which artificial intelligence (co-)determines our decisions, as well as with media pluralism and editorial standards. Together, the panellists reflected on the question of which values our society wants to base the public sphere on, and how we need to respond to this in the digital age.

The complete panel discussion on the “algorithm-based public sphere” with Ms Heise, Dr. Zimmer and Mr Welcherling at #digidemos 2017 can be viewed here:


In this context, Dr. Anja Zimmer has written an essay on the “automated public sphere”:

The automated public sphere: why we must rethink media pluralism

Media pluralism is indispensable for a vibrant political public sphere, and such a public sphere is unimaginable without critical journalism and media competing with one another journalistically. Journalistic offerings need reliable funding and findability. How we can preserve both for the future will be one of the most important questions for media regulation. Legislators, media authorities and academia must develop new approaches to this together.

The public sphere in the digital society is changing rapidly: for many people, Web 2.0 was synonymous with the decentralised, open and participatory culture of the Internet. The great hope was a new and possibly more democratic public space. Today, however, we must instead debate fundamental questions and challenges for society, politics and regulation. Many discussions deal with the question of how to counter the rise of “hate speech” or how to handle “fake news”. Another controversial topic is the question of who determines, or should determine, what users get to see and hear on the Internet, and which information they are offered.

One term is in the spotlight: algorithms, and the question of what role they play in opinion formation and the diversity of opinion. This is because algorithm-based decision-making systems have taken on various functions in our everyday lives: from financial market mechanisms and driverless cars all the way to partner selection – we live in an increasingly automated public sphere in which algorithms can influence our decisions, and our opinions as well. Exactly how Facebook, Twitter or Google – so-called information intermediaries – use algorithms remains a business secret, which is understandable to a certain extent, since these algorithms are their capital. Intermediaries collect personal data from users, aggregate it and analyse it using the algorithms underlying their services. These algorithms are continuously improved with the incoming streams of data. Moreover, they develop themselves further using artificial intelligence and can thus reach a complexity which, according to voices from within the IT corporations, surprises even insiders.

All of this changes our lives not only at the individual level but also in how society interacts, e.g. when our individual actions are prompted, evaluated or pre-selected by algorithms. And, as is so often the case, media and media use are at the focal point of these developments.

Intermediaries do not make editorial decisions – but they still wield influence

The services of information intermediaries undoubtedly create more diversity through new journalistic offerings, user-generated content and the exchange of information within individual networks. And without algorithm-based decision-making systems, we would not be able to cope with the flood of information, and this diversity would be unimaginable. However, information intermediaries also influence which topics come to our attention, what reach a piece of information achieves and which media are still part of our communication mix.

This poses new challenges for the preservation of pluralism. All the more so because the companies acting as intermediaries between content providers and users on the Internet usually hold considerable market power, which is further enhanced by network and lock-in effects as well as access to enormous quantities of data.

And even if intermediaries do not make editorial decisions and do not take the place of established media companies, they are by now far from neutral technology platforms. Through their technical mode of operation and their commercial self-interest, they do influence media use and thus the formation of opinion among users.

Currently, there is no indication that information intermediaries exert targeted political influence on the public sphere – doing so would fundamentally run counter to their business model. However, to maximise their profits, they require attention, which, alongside data, is the currency of the digital market economy. This means keeping users on their platforms for as long as possible. Accordingly, content that pleases users is good – but content that makes them interact is even better. Matching content can be placed profitably using algorithm-based personalisation. Whether that content is high-quality journalism or fake news has hardly mattered so far.

A question that remains largely unanswered is how media consumption will develop, and what consequences this will have for individuals, for various groups and for society as a whole. Initial studies show that young people increasingly use information intermediaries as an important or even exclusive source of news. At the same time, social networks show a potential for increased polarisation, which can e.g. distort the perception of the climate of opinion and fuel media hypes. Corresponding effects can be demonstrated more distinctly in some groups of the population than in others.

And even if the information repertoire in Germany is currently still broad, the dangers for opinion formation must not be underestimated. Is it really sufficient to define diversity of opinion in a rather static and linear way, or does it depend more strongly on actual use? Questions like this must be answered soon, because diversity of opinion can only be preserved preventively. Once damaged, it can hardly be restored. This is why immediate action is so important.

Developing new approaches together – ideas for modern regulation

We are in agreement: our information landscape is no longer imaginable without algorithms. They have many positive properties and help us keep track of the flood of information when using media. Eliminating them can therefore not be the objective. Instead, we must reflect on which principles should apply to algorithm-based decision-making power in a liberal, democratic and constitutional society. How can we preserve the pluralism of media and opinion, and how can we strengthen individual autonomy in dealing with algorithms? There are no easy answers to these questions, but there are starting points:

First of all, there must be systematic transparency. This cannot mean that search and recommendation services have to disclose their algorithms, since they could then no longer perform their task. However, platforms must at least provide transparency about their own role, e.g. by declaring their values and guiding principles. They can then be measured against their own statements. It is equally important for researchers and media authorities to be given access to data, e.g. so that they can trace the criteria and mechanisms of the algorithm-based decision-making systems that influence our media pluralism. Fundamentally, results are only as good as the available data allow them to be, and at the moment there are serious deficits in this respect.

Furthermore, there is a consensus on the importance of safeguarding freedom from discrimination. But what exactly does that mean? In times of increasing personalisation, new concepts are required here. It should be easy to agree that providers with a certain degree of market power must not privilege their own products. But what weight should be given, e.g., to context? Would different relevance criteria have to apply to purely service-related information? What status do journalistic offerings have? Would particular rules for findability be needed? Is it possible to develop shared values for this? These topics require societal debate and the development of benchmarks.

It may be necessary for regulatory authorities to rely more strongly on technological means in the future. This could strengthen users' freedom of choice if they could decide autonomously which selection tools to use to receive news or to interact with others. On platforms with market power, this would require e.g. data portability and open interfaces for competitors or open-source projects. And maybe, one day, the motto could be “bring your own algorithm” ...
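As a purely illustrative sketch of what “bring your own algorithm” could mean in practice – not a description of any existing platform interface – the following Python code assumes a hypothetical open interface through which a user-chosen ranking function, rather than the platform's own, orders the items in a news feed. All names (FeedItem, build_feed, the two example rankings) are invented for this illustration.

```python
# Hypothetical "bring your own algorithm" sketch.
# Nothing here reflects a real platform API; names and fields are illustrative.
from dataclasses import dataclass
from typing import Callable, List


@dataclass
class FeedItem:
    source: str                 # e.g. a news outlet or a contact
    topic: str
    published_hours_ago: float
    interactions: int           # likes, shares, comments


# A ranking algorithm is simply a scoring function the user plugs in.
RankingAlgorithm = Callable[[FeedItem], float]


def engagement_first(item: FeedItem) -> float:
    """Platform-style ranking: favour items that generate interaction."""
    return item.interactions / (1.0 + item.published_hours_ago)


def chronological(item: FeedItem) -> float:
    """User-chosen alternative: newest items first, interactions ignored."""
    return -item.published_hours_ago


def build_feed(items: List[FeedItem], rank: RankingAlgorithm) -> List[FeedItem]:
    """Order the feed with whichever algorithm the user selected."""
    return sorted(items, key=rank, reverse=True)


if __name__ == "__main__":
    items = [
        FeedItem("Outlet A", "politics", published_hours_ago=2.0, interactions=500),
        FeedItem("Outlet B", "science", published_hours_ago=0.5, interactions=20),
    ]
    # The same data, two different selection logics chosen by the user.
    print([i.source for i in build_feed(items, engagement_first)])
    print([i.source for i in build_feed(items, chronological)])
```

The point of the sketch is the design choice: if a platform exposes its feed data through such an open interface, the selection logic becomes exchangeable, and users, competitors or open-source projects could supply their own.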


Published: 13.12.2017

