Washington and Brussels are calling for far-reaching regulation of Facebook: above all, it is the opinion-shaping power of the world’s largest social network that makes politicians suspicious. But with every political intervention, the risk grows that Facebook will become an organ controlled by the state and by special interests.
Don’t let the politicians lie – fact-checking is a must
It was only a matter of time before social networks themselves became a subject of the US election campaign, because at some point a politician would demonstratively exceed the limits of what the companies’ guidelines still permit on platforms such as Twitter or Facebook. That has now happened: Twitter recently attached a fact-checking label to a tweet by the President of the United States about the susceptibility of postal voting to fraud, offering users further information on the subject, some of which contradicted Trump’s post. Facebook even deleted a number of his posts that dealt with the problems of the left-wing militant Antifa network and used symbols that had also been used by the National Socialists.
President Trump sees these measures as an interference with freedom of speech and wants to prevent such intervention in the content published on the platforms in the future with the recently issued executive order on “Preventing Online Censorship”. Meanwhile, leading Facebook employees, parts of the advertising industry, many politicians and parts of Facebook’s and Twitter’s user base are demanding exactly the opposite: “Intervene more in the content, don’t let the politicians lie – fact-checking is a must,” they argue.
The current regulation of social networks in the United States
In general, a dispute has broken out in the USA about how social networks should deal with content. It is hardly surprising that there is no consensus in this polarized country about what is merely sharp rhetoric and what is perhaps a call to violence, and that in the heated climate a post is quickly labeled fake news or disinformation even though its content is merely controversial. Americans will elect their president in five months, and the political establishment is trembling at the influence that Facebook, Twitter and YouTube could have on the outcome of the election. Whether politicians, lobbyists, civic associations, trade unions, advertisers or employees: in the end, they all want a say in what may be posted on Facebook, Twitter or YouTube. Private opinion, the argument goes, now needs state-imposed limits – social media must be regulated. Similar efforts are under way in Brussels.
But instead of regulation that offers politicians a gateway, what is needed are framework conditions that strengthen responsibility and market mechanisms. Do we really want political influence over such essential opinion platforms as Facebook or YouTube to increase? Which opinions will still be considered acceptable on Facebook in the future, when discussions about topics such as racism, environmental protection or American politics are so often threatened by the emotional, moralizing cudgel of opinion?
What would be needed, rather than the influence of moralizing opinion leaders on Facebook, is competition-promoting regulation that, first, transfers to Mark Zuckerberg what every entrepreneur has to carry: responsibility. Second, the business model would have to be adjusted so that the user finally becomes a customer (today he is the product). And third, any regulation should proceed from a responsible and courageous media consumer and strengthen him.
The current regulation of social networks in the United States dates back to the early days of the Internet. Long before Mark Zuckerberg founded Facebook, the American legislature had already created the ideal legal conditions for its success. At that time, in 1996 – Zuckerberg was just eleven years old – the Communications Decency Act (CDA) was passed in the USA. It was the era of the first Internet service providers (ISP), and the main aim was to prevent pornographic and other inappropriate content from poisoning the climate on the Internet. Section 230 decoupled the right to intervene in content from the obligation to bear ultimate responsibility for the content that remains: “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”
According to popular belief, these 26 words created the Internet. They became the legal business foundation of Facebook, founded in 2004. To this day, the social network counts as a neutral platform on which users post their content; by invoking Section 230, the company assumes only minimal responsibility for it. There have been no limits to Facebook’s growth: today, several hundred million posts appear on Facebook every day, and 2.8 billion people access the social network every month.
Facebook, like a publisher, has to take responsibility for what is seen on its pages
However, Facebook has to assume responsibility for what happens on the platform only in exceptional cases. And anyone who wants to avoid even that is best advised to retreat to a position of neutrality. Yet Facebook’s impartiality or independence is an illusion. Of course the company intervenes massively in the news feed – the stream of content that reaches users on their own Facebook pages. Facebook’s algorithms control what the user gets to see, when and how often. Their goal is to keep the user on the page as long as possible, because the longer he stays, the more advertising Facebook can show him. And the more Facebook can learn about the user (through his posts, the likes he hands out, the websites he visits long after he has left Facebook, and so on), the higher the price at which it can sell advertising space to advertisers. Ultimately, the user is the product that Facebook sells to the advertising industry.
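How such engagement-driven ranking works can be sketched in a few lines of code. This is a deliberately simplified illustration – Facebook’s actual ranking system is proprietary and vastly more complex, and all names and weights below are invented for the example:

```python
# Simplified, hypothetical sketch of engagement-driven feed ranking.
# Facebook's real system is proprietary; all names and weights are invented.
from dataclasses import dataclass

@dataclass
class Post:
    author_affinity: float   # how often the user interacts with this author
    predicted_clicks: float  # estimated probability of a click or like
    predicted_dwell: float   # estimated seconds the user will linger on it

def engagement_score(post: Post) -> float:
    # The feed optimizes for time on site: content expected to hold
    # attention (and thus expose more advertising) scores higher.
    return (2.0 * post.author_affinity
            + 1.5 * post.predicted_clicks
            + 0.1 * post.predicted_dwell)

def rank_feed(posts: list[Post]) -> list[Post]:
    # Highest expected engagement appears at the top of the news feed.
    return sorted(posts, key=engagement_score, reverse=True)
```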
The company increasingly counters the criticism leveled at Facebook – that it is a platform for false information and disinformation campaigns – with self-regulation: general rules of conduct, fact-checkers and a newly established oversight board are meant to manage the tightrope walk between the freedom of expression that follows from the CDA and the crossing of the limits of what is permissible. Facebook now pays a whole army of external, supposedly objective fact-checkers, who examine controversial posts for their truthfulness. Depending on the result, Facebook removes such posts, or its algorithms push them so far down the news feed that they become virtually invisible. And the twenty-member oversight board, staffed with external experts, is to decide in cases of dispute or doubt whether content has rightly been removed or annotated by Facebook.
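The decision logic of this fact-checking pipeline, as described here, can be sketched roughly as follows – again purely hypothetically, since Facebook’s actual thresholds and processes are not public:

```python
# Hypothetical sketch of the moderation flow described above.
# Verdict names, the downranking factor and the flow itself are assumptions.
from enum import Enum

class Verdict(Enum):
    ACCURATE = "accurate"
    DISPUTED = "disputed"
    FALSE = "false"

def apply_fact_check(post_id: str, verdict: Verdict,
                     feed_weights: dict[str, float]) -> str:
    if verdict is Verdict.FALSE:
        # Posts judged false are removed outright.
        feed_weights.pop(post_id, None)
        return f"post {post_id}: removed"
    if verdict is Verdict.DISPUTED:
        # Disputed posts stay, but slide so far down the news feed
        # that they become virtually invisible.
        feed_weights[post_id] = feed_weights.get(post_id, 1.0) * 0.05
        return f"post {post_id}: labeled and downranked"
    return f"post {post_id}: left untouched"
```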
Self-regulation
But this kind of self-regulation is hugely problematic. First, Facebook thereby moves very close to the role of a publisher. Second, it rests on the false assumption that there is only one truth and that facts can be cleanly separated from opinion. An allegedly objective assessment of a controversial claim does not necessarily refute that claim.
Against this background, and given the increasing attempts by politicians to exert institutional influence on Facebook and its content, it is time to create clear responsibilities: Facebook, like a publisher, has to take responsibility for what is seen on its pages. This responsibility would mean a duty for Facebook to remove illegal content. It would also give the company the freedom to edit, classify (as fact or opinion), curate and select content according to its own style. That would massively change today’s face of Facebook. Facebook could no longer hide behind the illusion of neutrality, and users could no longer be deceived by it. Facebook would presumably become less attractive as an advertising platform for the advertising industry, and users would probably be asked to pay. But in return they would finally turn from product into customer, and Facebook would become their product rather than the advertisers’.
At the same time, competition in the market for social networks should finally be stimulated: it should be possible to move from Facebook to another social network together with all of one’s contacts and connections – the so-called social graph. Just as in the mobile phone market it was the right to take one’s phone number to another provider that got competition going, the portability of the social graph should be guaranteed. The design of the various social networks would then be guided more by user demand, and not, as is to be feared in the future, by politics and interest groups.
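Technically, such portability would presuppose a standardized export format that any competing network could import – comparable to number porting. The following sketch is purely illustrative; no such standard exists today, and all field names are invented:

```python
# Illustrative sketch of social-graph portability via a neutral export
# format. The schema is invented; no such industry standard exists yet.
import json
from dataclasses import dataclass, asdict

@dataclass
class SocialGraphExport:
    user_id: str
    contacts: list[str]  # stable identifiers of the user's connections
    groups: list[str]    # communities and pages the user belongs to

def export_graph(graph: SocialGraphExport) -> str:
    # Serialize in a neutral format so a receiving network can rebuild
    # the user's connections, much like porting a phone number.
    return json.dumps(asdict(graph))

def import_graph(blob: str) -> SocialGraphExport:
    return SocialGraphExport(**json.loads(blob))
```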
But there is one thing the social media user cannot avoid: he must act as a responsible and courageous media consumer. It is not brave to arbitrarily topple historical statues from their pedestals in virtual spaces, to ban opinions that deviate from the mainstream, or to elevate demands for the protection of minorities or the environment into a quasi-religion. Nor is it brave to indulge in the illusion that there could be a Facebook that, as an independent platform, serves up the ultimate truth to its users. Responsible citizens dare to engage with the ideas of dissenters on social networks – and they can do so rationally and critically.