Academic Master

Sociology

Final Analysis Paper: Social Media’s Double Standard

Internet and social media technologies were first used in US elections in 1996, when the first presidential candidate's campaign website appeared. In the 2000s, digital technologies enabled online fundraising and the mobilization of the electorate via the Internet. By the late 2000s, social networks had become the primary channel for promoting candidates, and at the beginning of the 2010s mobile technologies and Big Data emerged. Since the mid-2010s, mobile messengers have been developing into campaign tools in their own right.

The most illustrative examples are the election campaigns of Barack Obama and Donald Trump. In 2008, Obama's headquarters began using mobile technologies in conjunction with social networks. According to research at the time, 88% of social network users who were eligible to vote had mobile phones. McCain's headquarters started working with Facebook and YouTube seven months later, and with Twitter a year and four months later. A number of technologies, including mobile ones, were not used at all.

Obama's pre-election website, launched in 2007, allowed users to register a personal account, subscribe to thematic mailings, and track the calendar of campaign events, while the site's content changed depending on the user's geolocation. Through the personal account, visitors could also make contributions, sign up as volunteers, and mark events of interest. The site likewise linked to all of the campaign's social network accounts. Within the 2008 election campaign, Obama's main Internet strategy was a presence on the maximum possible number of platforms, including industry and niche sites such as LinkedIn and Reddit. As part of this strategy, the electorate was segmented and different platforms were used to reach each segment.

In the 2016 US presidential election, the situation repeated itself, but this time the advantage was on the side of the Republican candidate, Donald Trump. Engagement among social network users was very high, and Trump's campaign won not only in quantitative but also in qualitative terms. For example, Trump's Facebook page not only had more subscribers than Hillary Clinton's, but also led in the number of likes, comments, and reposts. It should be noted that in a sample of 20 Facebook posts about Trump, only one was positive; this had, and still has, a viral effect on the image of the then US president. The political year 2017 was marked by the growing role of instant messengers, in particular the political Telegram channels that proliferated like mushrooms and became a phenomenon through which anonymous, or not so anonymous, insider information spread with varying degrees of reliability. At the same time, political scientists rate Telegram highly as a means of influencing the political situation and the political class, because the information broadcast there is not available through official channels.

The scandal around Facebook, Cambridge Analytica, and the misuse of 50 million data records has been brewing for two weeks now. Because a significant part of the coverage on the subject is wrong, it needs some sorting out. So far, the story often goes like this: Facebook suffered one of the biggest data leaks in the history of the social network. The company Cambridge Analytica was able to tap 50 million records, which influenced the US election in favor of Donald Trump. In response to the data leak, companies such as Tesla and SpaceX deleted their Facebook pages. The social network itself is responding to the massive pressure and wants to give its users better and easier control over their data in the coming months.

Five years ago, the American neuroscientist Aleksandr Kogan came up with an idea: he wanted to collect user data for scientific purposes, using an app within Facebook. He built a personality quiz called "thisisyourdigitallife". Around 270,000 people used this quiz and, for that purpose, knowingly agreed to share data such as their place of residence and their Facebook likes. You surely know the confirmation window involved: it also appears when you play Candy Crush or sign in to other online services such as Spotify or Tinder with your Facebook login. Anyone who submits data to an app within Facebook does so knowingly. So this was not a data leak. The problem was that thisisyourdigitallife could evaluate not only the data of its roughly 270,000 users, but also that of their friends. This turned 270,000 records into 50 million. This could happen, first, because Facebook in principle gave its app developers this capability at the time. Second, the 49.7 million people whose records were also affected had never worked through the social network's privacy settings. Had they done so, they would have found, hidden in the menus, the option to prohibit the transfer of their data to their friends' apps. Facebook switched off the feature that allowed apps to siphon friends' data in the spring of 2015, but only after it had long caused serious concerns among privacy advocates. Until then, however, many apps are likely to have extracted such data. So it cannot be ruled out that more cases of this kind are waiting alongside the Cambridge Analytica case, even if not as big and as political.

It was not a data leak, but a feature of Facebook. The problem lies first with the users, who are too lazy to go through their settings, and second with Facebook, which left this door for data sharing open for years.

If Facebook appears here as a victim, that perception is partly true. However, over the years Facebook apparently failed to rigorously control its developers, and it handled the Cambridge Analytica case quite formally. Facebook has known since 2015 that Kogan sold data to the company, and merely obtained written assurances from both parties that the data had been deleted, which did not happen. A tougher approach would have been appropriate here; former Facebook employee Sandy Parakilas sums up the problem in the Guardian:

"Once the data left Facebook's servers, there was no control, and no insight into what was going on." A second misconception: Facebook sells our data, because after all, that is supposedly its business model. People have talked a lot of nonsense on this point in recent years: it is not Facebook's business model to sell our data. On the contrary, doing so would be detrimental to the business model. Facebook earns $40 billion a year with personalized advertising. Advertisers can, for example, place an ad on Facebook aimed at Norah Jones fans who are between 30 and 40 years old, male, and living in Dormagen. But the advertisers never learn that it is me who sees the ad. If the data were sold and circulated, it would no longer be so valuable. So Facebook sits on its data treasure. Facebook does not sell user data, only access to the respective users. The transfer of data to app developers was and is therefore a potential threat to Facebook's own business model.

The Cambridge Analytica article cited evidence from only two people, namely the inventors and sellers of the Big Data methods in question. The article confused correlation and causality, a popular mistake. In addition, it concealed that Ted Cruz had dropped Cambridge Analytica in the middle of his Republican primary campaign, apparently because he was not satisfied with the results. The necessary qualifications were missing from the text, such as Cambridge Analytica's failures and its controversial role in the Brexit and Cruz campaigns.

To this day, there are doubts that Cambridge Analytica was able to influence elections and referendums, even after the Guardian's coverage kicked off the Facebook data scandal. Jürgen Hermes points out in the Spektrum science blogs that the company promises a lot but always remains very vague about how these promises are supposedly implemented. Cambridge Analytica still owes us proof to this day. Instead, the British broadcaster Channel 4 showed that the company relies on methods other than mere data analysis:

How reliable are the claims of a company that works with such methods? Asked whether it could obtain compromising details about a political opponent, its manager answered that one could send girls to the candidate's house (Ukrainian girls, he noted, are very nice), or have someone offer a candidate money for his election campaign in exchange for a piece of land, and record the whole thing on video.

None of this means that we should not worry about targeting and manipulation. With all the data that Facebook in particular holds, we have to remain vigilant, especially after the psychological experiments that Facebook itself has conducted on users. We should also keep a close eye on Cambridge Analytica. However, the story of this company carries conspicuously many signs of being hot air. Political targeting may well work, but probably only if you have the possibilities that only Facebook has. When it comes to Cambridge Analytica, too many questions remain open.

The bad Facebook on one side has appalled companies and users on the other. The highlight: WhatsApp co-founder Brian Acton called on Twitter for people to turn their backs on Facebook, and Tesla and SpaceX boss Elon Musk followed suit. Afterwards, it was reported almost everywhere that the Facebook pages had been deleted. But you cannot simply delete Facebook pages; you can only deactivate them. Hardly any report mentioned that the pages can probably be put back online at any time. Each page had around 2.6 million fans; it would be madness to give up the opportunity to reach them via Facebook. Hence my suspicion: a public relations stunt.

Facebook announced that its privacy tools would be redesigned. On the smartphone, the settings will soon no longer be spread across almost 20 different subpages, but accessible in one place. Outdated settings options are being revised. Among other things, this should help users understand which information can be shared with apps and which cannot. (Screenshot: the old privacy tools on the left, the new ones on the right.) It was often said in this week's coverage that Facebook is reacting to the data scandal with this change. That is wrong. Facebook itself states in its announcement that it has been working on a large number of these updates for quite some time, and indeed it had announced the new privacy tools back in January. Anyone familiar with the subject knows that central functions such as privacy tools cannot be revised in just a week and a half. Developments of this kind usually take entire development teams several months.

Of course, Facebook is happy to let people believe that the revised privacy tools are a response to the data scandal. The timing is good, and Facebook has often chosen its timing well: Facebook listens to criticism. Facebook admits that everything is too complicated. Facebook rebuilds. That was the pattern back in 2010, and it is the pattern today. To the outside world, rebuilding sounds good. Internally, however, constant change can also confuse users: who wants to roam through new menus on a regular basis? And then there is Facebook's announcement that it will now be easier to download your own data, because "after all, it is your data," writes Facebook. This function will make it possible to download a copy of your own data in a format that should allow you to upload it to other online platforms.

That sounds like a great, open, accommodating step by Facebook. However, the corporation conceals that this function is in fact a statutory requirement once the EU's new General Data Protection Regulation applies from the end of May. What is being presented as a response to the data scandal is not a response to the data scandal: the new privacy tools had probably been in the planning for a long time, and the option to download your own data is a legal requirement. So there has to be more if Facebook really has an interest in winning back the trust of many users. A lot more. And how much of journalists' fear of Facebook is flowing into their coverage of Facebook?

Facebook is one of the players on the net that has posed enormous challenges to our profession for years, partly because Facebook increasingly controls which content is delivered to our audience via the news feed and which is not. That is why there was recently heated discussion about Facebook's announcement that it would show more content from family and friends in the news feed, and less from pages. This is a bit of personal empiricism, but: in my perception, the loudest hyperventilation came from people who had long been annoyed with Facebook and suddenly saw a reason to quickly delete their own pages, retreat to their own websites, and wait for users to come by. That the first algorithm changes may have been in effect since autumn 2017 got lost in the debate; that even Facebook does not yet know what the changes will look like in the end, all the more so.

It is so important that we discuss all these things. But it is just as important that we discuss the right points, that we attack Facebook in the right places, and that we stick to the facts. Because the list of Facebook's failings is long: for years, Facebook ignored the concerns of privacy advocates and allowed applications to access the data of its users' friends. That this gate stood open for so long, that Facebook did not consider it necessary to control app developers strictly, and that it was satisfied in retrospect with written confirmations that sold data had supposedly been deleted, was grossly negligent. Facebook still does not provide enough transparency. To me, many of its notices are too small, and the language is often too convoluted. But to be fair to Facebook, the prompt to check your settings has been displayed again at the top of the news feed for quite some time. The fact also remains: many users are too lazy to take care of it.

So far, Facebook has shown no serious interest in informing its users consistently and honestly. It would be very easy to send an e-mail to the 50 million people affected, so that everyone could finally be certain whether they themselves were affected or not. Full-page ads in daily newspapers with embarrassing translation mistakes are, in my opinion, not enough. It would also be fair not to give the impression of reacting to criticism while merely introducing functions that were planned anyway. Facebook still has a communication problem: public reactions come too late, statements to the press are too rare, and contact persons for ordinary users are completely missing. A company of this size and with so much money behind it could act quite differently.

Facebook does not live up to its responsibility. The strategy of rolling things out first, everywhere, and only then seeing what happens is outdated; it would make sense in the future to think about the effects of individual measures and functions beforehand. It would also be responsible to make the profiles of new users as safe and private as possible by default, and to leave it up to people themselves whether they want more publicity. There is also a growing belief that government has to regulate Facebook: it is no longer a normal company, but has in recent years become an elementary component of our communication and information society, with more than 30 million users in Germany alone. At the same time, the company is moving too slowly, and the dangers that social networks pose to our democracy, among other things, have recently become clear. But for the debate it is important that we stick to the facts and do not fall into legend-building and populism, because that is not how we get to grips with this highly complex issue.
