In surveillance capitalism, the traded good is personal data. The concept of surveillance capitalism, as described by Shoshana Zuboff, arose as advertising companies, led by Google's AdWords, saw the possibilities of using personal data to target consumers more precisely. Technologies for tracking user behavior across IT domains, beyond company borders, have thus become important for these companies. Similarly, we know that Facebook has a vast number of third-party app providers supporting users with attractive features inside the social media platform. In the mobile, digital world we now live in, customer-generated content can even be health data typed in or generated by health apps on our mobile devices and smart watches, and so on.

The Facebook/Cambridge Analytica scandal has launched surveillance capitalism into the global consciousness. Take, for example, the role of Facebook in the 2016 US election, where Russia is accused of backing over 3,000 ads, many of which violated US federal law. Isaak and Hanna [12] are among the many who call for stronger regulation after the Cambridge Analytica scandal, in which Facebook handed over information about 87 million users, and which influenced the 2016 US election. The US Congress reacted strongly to the news, and Facebook CEO Mark Zuckerberg had to face Congress to explain the situation [13]. Echo chambers are dangerous – we must try to break free of our online bubbles. Surveillance capitalism threatens an aspect of our freedom so basic that we are not used to defending it.

Does the competitive use of data and analytics lead to ethical problems? To evaluate the ethical aspects of surveillance capitalism we can turn to the Navigation Wheel developed by Kvalnes and Øverenget [5]. I will also comment on the ethical implications of the public-private data sharing (such as the NSA retrieving data from Google) that has become so common in the post-9/11 world. As Aral Balkan, ethical designer and founder of the Small Technology Foundation, puts it: "Not invented in 1984, but we know where they got the idea." We have also illustrated that individual users have experienced moral dissonance for a while, reflected in the Privacy Paradox, but that they continue to use the digital services anyway (moral neutralization?).

If this type of analysis and use of personal data is not made explicit to the user, and the user has not consented (and there is no other legal basis for the processing), then the data treatment will be illegal according to GDPR. And if it is made explicit and consent is given – then it is legal, right? At the individual level, on the other hand, the situation is different.
Few understand the positive and negative aspects of surveillance capitalism to their full degree. Surveillance capitalism describes a market-driven process where the commodity for sale is your personal data, and the capture and production of this data relies on mass surveillance. Zuboff describes it as a novel economic mutation bred from the clandestine coupling of the vast powers of the digital with the radical indifference and intrinsic narcissism of financial capitalism and its neoliberal vision that have dominated commerce for at least three decades, especially in the Anglo economies. We have centered the discussion around the fact that the traded good in surveillance capitalism is personal data. Players within the system have the power to assign value to the goods and sell them to the market. In Table 1, I listed definitions related to personal data. Anonymization is the process of removing any information that shows which particular person something relates to [4]. Even the prediction results are considered personal data. Recent research is starting to show that the original concept of privacy protection through anonymization is being challenged.

We are only at the beginning of the digital era. Think about our everyday digitally connected lives, where we use smartphones, smart watches, apps and PCs, visit web pages, participate in social networks, and so on. Advanced IT services have been made available to both personal and professional users through modern cloud technologies. At a generic level, we can look at some of the best-known companies in this space: Google, Facebook, Twitter and Amazon. Users and commercial companies are being connected through targeted advertisements (driven by AI) and similar mechanisms. The algorithms, powered by our personal data, that predict what we like and dislike tend to show us biased information. Mitchell [1], Sherman [2] and Chen et al. describe different aspects of this development.

Borcea-Pfitzmann et al. [7] and Hansen et al. [8] have presented interesting ideas about the amount and types of digital information being collected about individuals throughout their digital lives. Big data provides remarkable benefits, the argument goes, so users choose these benefits over risks they are somewhat aware of but do not understand. The fact that anonymized data can be re-identified, and the question of whether we as users are competent to give informed consent, are topics for legal discussion.

Nation states are currently building capabilities (inspired by the 2016 US election and the Brexit process) to influence democratic processes. Governments are currently building knowledge on how to tackle threats to democracy, and I believe that is a driving force for change and stricter regulation. We have argued that governments are at the beginning of understanding the consequences, and that they have increasing concerns that surveillance capitalism may influence democracy and citizens' human right to privacy.

How do we measure goodwill? One would expect that goodwill can be measured in terms of changes to the user base, or influence on the stock market. The tricky part is to balance all the viewpoints listed in this chapter, and all the ethical aspects discovered by using the Navigation Wheel.
"Surveillance capitalism is a further evolution of capitalism that follows in the old pattern of taking things that live outside the market, subordinating them to the market dynamics as commodities that can be sold and purchased, but with a dark twist," Zuboff says in the interview. Her claim, and the dark twist, is that the companies collect data without the users' awareness.

Do we understand the amounts of data being collected? Is an ordinary user capable of imagining that the mere collection of meta-data (likes) and the analysis (through machine learning) of those data are sufficient to generate a valid and accurate psychological profile of a person? What privacy do we have left, when we know that the amounts and types of data being traded in the surveillance capitalism context exceed the single-source "like" data type by orders of magnitude? The customer-generated data is interesting to analyze in itself. We have started to see, but not yet fully understood, all aspects of the surveillance economies and data-driven communities.

The Privacy Paradox is the situation where people claim to be concerned about their online privacy but do very little to protect their personal data – they continue using digital services. In my opinion, we as a society are currently experiencing moral dissonance. Overall, it is easy to find good arguments that justify current practices; there is no need to repeat the arguments in the opposite direction here. Only the future will show whether we accept and neutralize the situation as a society, or not. One could imagine a world where a product or feature stays offline until the team has resolved its ethical dilemmas – a sort of Hippocratic Oath for upholding ethical practice. These are complex topics, and in my discussion of legal aspects I have also touched upon moral questions.

And what if the user has consented to the data treatment? If a user has given consent to data processing and data analysis, does he or she actually have the competence to give informed consent? The evaluation of whether something is illegal or not is extremely complex and must be carried out on a case-by-case basis by professional legal personnel. Still, from my professional experience there is one common pitfall, and one true and new ethical dilemma; I return to the dilemma further below. The pitfall: creating truly anonymized datasets requires professional engineers and data scientists. What is achieved in practice is often closer to de-identification and pseudonymization, which, as we can see from Table 1, is regulated by GDPR. GDPR does not regulate truly anonymized data sets, but when the anonymization falls short, the trading of data will be illegal according to GDPR – often without the knowledge of the companies taking part in the trade.
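To make the pitfall concrete, here is a minimal, hypothetical sketch in Python (using pandas, with invented columns and values): simply dropping the direct identifier produces a pseudonymized data set, not an anonymized one, because combinations of the remaining quasi-identifiers can still single out individuals. This only illustrates the k-anonymity idea; it is not a recipe for GDPR-compliant anonymization.

import pandas as pd

# Hypothetical records; column names and values are invented for illustration.
records = pd.DataFrame({
    "name":       ["Alice", "Bob", "Carol", "Dan"],
    "zip_code":   ["0150", "0150", "5003", "5003"],
    "birth_year": [1984, 1990, 1984, 1984],
    "gender":     ["F", "M", "F", "M"],
    "diagnosis":  ["asthma", "diabetes", "asthma", "flu"],
})

# Naive "anonymization": drop the direct identifier.
pseudonymized = records.drop(columns=["name"])

# k-anonymity check: how many records share each combination of quasi-identifiers?
quasi_identifiers = ["zip_code", "birth_year", "gender"]
group_sizes = pseudonymized.groupby(quasi_identifiers).size()
print(group_sizes.min())  # k = 1 here: every record is still unique, hence re-identifiable

Real anonymization would require generalizing or suppressing the quasi-identifiers until no combination is unique, which is precisely the kind of work that requires professional data engineering.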
Our discussion so far has been on the critical side, looking at the possible negative sides of the surveillance economy, or user-data-driven economies. During the last decade these companies have provided services that are world class and free. Would the companies' innovation rate and eagerness to attract new and retain old customers be equally high without the existing competition to attract customers to the free service platforms?

Capitalism is an economic system that thrives by bringing a product or service into the marketplace. In data-driven economies, there is not just one single data set for sale. In economics we talk about market power. Each company has unique values, and with this in mind it is hard to see the massive collection, use and trade of personal data as being in conflict with these companies' values – it is their core. Remember that users leaving the platform is bad for business in the surveillance capitalism paradigm.

To keep us satisfied and make us stay within the platforms' own ecosystems, we are presented with information and news that the algorithms predict we will like. The US Congress has recently launched an initiative to understand how social media can be weaponized [14].

Let us get back to the original question of this chapter – is it legal? Based on my experience, and being above average interested in reading privacy policies, I can confirm that many of the big IT service companies describe the use of personal data for analytical purposes, and that they reserve the right to use the data for analysis and to transfer it to third parties. If a company does the right things right, it may be possible to operate within the boundaries of the legal GDPR framework, even in a surveillance capitalism setting. And if the consent is not sufficiently informed, then it is illegal, or…? We do not understand the amounts of data being gathered and how they are being used.

Example 2: Imagine that a company collects user logs and user activity on a network to search for anomalies and signs of security breaches; a simple sketch of what such screening might look like follows below.
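As a purely illustrative sketch of Example 2 (not any particular company's system), the following Python snippet flags users whose activity volume deviates strongly from the norm in a hypothetical access log. The log format, user IDs and threshold are all invented. The point is that even this legitimate security use case processes personal data: user identifiers and behavioral records.

from collections import Counter
from statistics import median

# Hypothetical access log entries: (user_id, action). Format invented for illustration.
log = [("u1", "login"), ("u1", "read"), ("u2", "login"), ("u3", "login")]
log += [("u4", "login")] * 50   # u4 shows an unusual burst of activity

events_per_user = Counter(user for user, _ in log)

# Crude rule of thumb: flag anyone with more than ten times the median activity.
threshold = 10 * median(events_per_user.values())
suspicious = [user for user, count in events_per_user.items() if count > threshold]
print(suspicious)  # ['u4'], a candidate for closer security review

Whether such logs may later be reused for other kinds of profiling is a separate question of purpose limitation.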
Surveillance capitalism is facilitated by the technological revolution we have already mentioned: extreme data processing power, low-cost storage and an abundance of data, which allow for big data analysis powered by machine learning and AI. Clicks, comments, transactions, and physical movements are being increasingly recorded and analyzed by big data processors who use this information to trace the sentiment and activities of markets and voters. Similar to meta-data, customer-generated data can also be analyzed to predict behavior and generate user profiles. This is far from an exhaustive list of data types; however, it is interesting to observe their characteristics. The European Commission [2] provides the definition of personal data; consequently, all data within the scope of surveillance capitalism are defined as personal data, including user-generated content and meta-data.

We discussed different ethical aspects related to surveillance capitalism in the previous chapter. However, even with GDPR there are ethical dilemmas: how do we interpret the privacy principle of purpose limitation in a data-driven environment powered by advanced machine learning and artificial intelligence? And then there is the ethical dilemma mentioned earlier: is it morally right to trade anonymized data, when we know that data from different data sources can be run through an AI engine to re-identify users with a high degree of certainty? Observing that surveillance capitalism undermines freedom by transforming "human experience" into "behavioral data", Zuboff argues that it is unethical for corporations such as Google and Facebook to collect personal data for financial gain.

I back up this claim with the facts that Shoshana Zuboff has illuminated us with her book on surveillance capitalism (2019), that the EU has reacted by introducing GDPR (2018), that there have been US congressional hearings after the Cambridge Analytica incidents [12][13] (2018), and that the US Congress is starting to investigate the power of the large social media platform companies [14] (2019). After the Cambridge Analytica scandal, Facebook settled with the US government and paid a fine of 5 billion dollars [15], which is a substantial amount of money. Such fines should make even the wealthiest companies consider, and be cautious about, finding loopholes in the legal frameworks (loophole ethics). Facebook has reportedly lost 15 million users during the last two years, according to Amy Gesenhues [16]. Users are simply switching social media platforms. The more users and the more data a company has, the more the company and its data sets are worth.

How can we make informed decisions in democratic processes if we are seldom exposed to views and arguments alternative to our own? But if users give consent, is it a truly informed decision? Thus, I will make the provocative claim that in the digital space, most users are incompetent to give informed consent. Think about the research by Youyou and colleagues [6] that I mentioned earlier: by analyzing one single data source (Facebook likes), they were able to predict our psychological profile, political attitudes, physical health and more.
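To illustrate the kind of inference Youyou and colleagues demonstrated, here is a minimal toy sketch in Python (scikit-learn and NumPy, with randomly generated data): a binary user-by-page "like" matrix is enough to train a model that outputs predictions about a person. The data, the trait and the model are all invented, and the original study was far more sophisticated; the point is only that behavioral meta-data yields predictions, and the prediction itself becomes new personal data.

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_users, n_pages = 200, 50
likes = rng.integers(0, 2, size=(n_users, n_pages))   # 1 = user liked the page (toy data)

# Pretend that liking a handful of specific pages correlates with some trait.
trait = (likes[:, :5].sum(axis=1) >= 3).astype(int)

model = LogisticRegression(max_iter=1000).fit(likes, trait)

new_user = rng.integers(0, 2, size=(1, n_pages))       # meta-data about one person
print(model.predict(new_user), model.predict_proba(new_user))
# The output is a prediction about an individual, which is itself treated as personal data.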
Shoshana Zuboff has explored a phenomenon she calls surveillance capitalism, and in this chapter we have looked at ethical aspects related to that concept. There is no doubt that these companies and their free services create value for society. Further, Google has made major steps forward in AI research and has made advanced tools and algorithms openly available to private and professional users, free of charge. Is it not only fair that they generate profit from our data when we get to use their services for free? From a legal and ethical perspective, can we really answer yes to such questions in the new digital era? We are analog people living in a digital world. The Privacy Paradox is a phenomenon described, but still not fully understood, by researchers.

As already discussed, Facebook has been subject to scrutiny by the US Congress, and the news about the Cambridge Analytica scandal has been covered extensively by the press. Companies like Facebook and Google may do business within the boundaries of the legal framework, but there is a need to evaluate what informed consent means. Most companies have well-defined terms of use and cookie policy descriptions. However, we still also have to ask the moral question: is it right?

From Article 9 of GDPR, we know that there are especially strict rules when it comes to special categories of personal data, including "racial or ethnic origin, political opinions, religious or philosophical beliefs, or trade union membership, and the processing of genetic data, biometric data for the purpose of uniquely identifying a natural person, data concerning health or data concerning a natural person's sex life or sexual orientation". These are data defined as special category data within GDPR, and they are especially important to protect in order to ensure our legal and human right to privacy.

What is one of the main purposes of collecting, processing and analyzing personal data and behavioral data? As Zuboff claims, it is to be able to predict future user behavior or intentions. A Computerworld article from 2009 by Robert Mitchell [1] describes how Google made changes to their algorithms to become better at targeting advertisements towards their users: they were able to go from directing ads towards customers based on keywords in the text they read (e.g. in Gmail) to a strategy where they could combine data, build user profiles and place ads that users were even more likely to be interested in.
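The shift Mitchell describes can be illustrated with a small, hypothetical Python sketch (the ads, keywords and profile below are invented): the first approach matches ads only against keywords in the text currently being read, while the second scores ads against a profile aggregated from many data sources, which is where the broad collection of personal data pays off.

email_text = "see the attached agenda for the office meeting next week"
ads = {"hiking_boots": {"hiking", "outdoor"}, "office_chair": {"office", "desk"}}

# Old approach: match ads against keywords in the text the user is reading right now.
words = set(email_text.split())
keyword_scores = {ad: len(keywords & words) for ad, keywords in ads.items()}

# Newer approach: score ads against a cross-service user profile instead.
user_profile = {"hiking": 0.9, "travel": 0.7, "office": 0.1}   # e.g. built from searches,
                                                               # location history, purchases
profile_scores = {ad: sum(user_profile.get(k, 0.0) for k in keywords)
                  for ad, keywords in ads.items()}

print(max(keyword_scores, key=keyword_scores.get))   # office_chair: based on what is being read
print(max(profile_scores, key=profile_scores.get))   # hiking_boots: based on who the reader is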
Low-cost processing power and storage, and the vast amount of data being generated, collected, stored and analyzed 24/7/365, have shaped the field of Big Data Analytics; e-commerce and market intelligence are among the areas affected. Companies interacting in the surveillance capitalism domain are huge multinationals, often based in the US or China. For each service provider we use – whether online shops and services, our health institution or our workplace – we generate digital identities with large amounts of personal data. Think of the third-party apps on Facebook: seemingly they are free, but behind the scenes there is a mutual agreement to share user data between Facebook and the third-party supplier.

Even though I am sceptical, I am a user of free services from Google. Personally, I cannot imagine going to a new city without Google Maps by my side. Would we be willing to pay for the services, and how much would we be willing to pay? Both Miller and Sherman explain why: the currency we as users pay with is our personal data, and, as Zuboff states, this data is being sold in a new digital marketplace.

However, there is one big and important question remaining: do we as users give an informed consent? The EU has the strongest privacy regulations in the world, through GDPR [3]. GDPR Recital 32 further details the requirements for consent: "Consent should be given by a clear affirmative act establishing a freely given, specific, informed and unambiguous indication of the data subject's agreement to the processing of personal data relating to him or her, such as by a written statement, including by electronic means, or an oral statement." We as users do have a real choice: we can read and accept the policies, or we can reject them – at the cost of not being able to access and use the service. We need to understand what types of data are being collected and processed, and under which circumstances, before we can go into a discussion of whether it is legal or not.

Examples of moral dissonance have been observed in user groups by the research community over the last decade. In his book, Kvalnes [5] describes the concept of moral neutralization, a process that can take place after initial moral dissonance. But the evidence and examples that have come to governments' attention point in the direction of stricter regulation – and against a moral neutralization of the status quo at the societal level.

What if the data sets being traded and used to build the prediction models are anonymized? Then trading is legal within the scope of GDPR. But what if multiple data sets are purchased by a company that uses AI to re-identify individuals based on 15 data points…? "Our results suggest that even heavily sampled anonymized datasets are unlikely to satisfy the modern standards for anonymization set forth by GDPR and seriously challenge the technical and legal adequacy of the de-identification release-and-forget model," Rocher and colleagues claim.
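As a minimal illustration of the re-identification problem (toy data in Python/pandas, not the generative-model approach of Rocher and colleagues): an "anonymized" data set can be joined with an auxiliary, identified data set on a few shared quasi-identifiers, and the sensitive attributes are suddenly tied to names again. All values below are invented.

import pandas as pd

# "Anonymized" data set offered for sale (invented values).
anonymized = pd.DataFrame({
    "zip_code":   ["0150", "5003"],
    "birth_year": [1984, 1990],
    "gender":     ["F", "M"],
    "purchases":  ["pregnancy test", "diabetes strips"],   # sensitive attribute
})

# Auxiliary identified data set, e.g. a marketing list or public register (invented values).
register = pd.DataFrame({
    "name":       ["Alice Hansen", "Bob Berg"],
    "zip_code":   ["0150", "5003"],
    "birth_year": [1984, 1990],
    "gender":     ["F", "M"],
})

# Re-identification by linkage on the shared quasi-identifiers.
linked = anonymized.merge(register, on=["zip_code", "birth_year", "gender"], how="inner")
print(linked[["name", "purchases"]])

With more attributes per person, far more realistic linkage becomes possible, which is exactly why Rocher and colleagues question the release-and-forget model of de-identification.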
I am also a parent and see the value Facebook has for coordinating parents, school classes and sports teams. Since this privacy research [7][8] in the mid-2000s, surveillance capitalism has emerged – in the exact opposite direction of user-controlled digital identities and data. What are our preferences? Even with a strict rule set such as that defined in GDPR, it is complex to evaluate what is legal and what is not. In medical ethics, informed consent hinges on questions such as: Is the patient in a position to reason and assess options and alternatives? Does the patient understand the consequences of his or her choices? Surveillance capitalism's "means of behavioral modification" at scale erodes democracy from within because, without autonomy in action and in thought, we have little capacity for the moral judgment and critical thinking necessary for a democratic society.

References

Barth, S. and M.D.T. de Jong, The privacy paradox – Investigating discrepancies between expressed privacy concerns and actual online behavior – A systematic literature review. Telematics and Informatics, 2017. 34(7): p. 1038-1058.
Borcea-Pfitzmann, K., et al., What user-controlled identity management should learn from communities. Information Security Technical Report, 2006. 11(3): p. 119-128.
Chen, H., R.H.L. Chiang, and V.C. Storey, Business intelligence and analytics: From big data to big impact. MIS Quarterly, 2012.
Echo chambers are dangerous – we must try to break free of our online bubbles. The Guardian, 2017; Available from: https://www.theguardian.com/science/blog/2017/dec/04/echo-chambers-are-dangerous-we-must-try-to-break-free-of-our-online-bubbles.
Gesenhues, A., Facebook lost 15 million users? Marketing Land, 2019 [cited 2019 22 September]; Available from: https://marketingland.com/facebook-lost-15-million-users-marketers-remain-unfazed-258164.
Hansen, M., A. Pfitzmann, and S. Steinbrecher, Identity management throughout one's whole life. Information Security Technical Report, 2008. 13(2): p. 83-94.
Høgseth, M.H., Personvern-smekk for Facebook: Må betale 42,7 milliarder [Privacy fine for Facebook: must pay NOK 42.7 billion]. E24; Available from: https://e24.no/naeringsliv/i/6jywlo/personvern-smekk-for-facebook-maa-betale-427-milliarder.
Isaak, J. and M.J. Hanna, User Data Privacy: Facebook, Cambridge Analytica, and Privacy Protection. Computer, 2018.
Kvalnes, Ø., Moral Reasoning at Work. 2015: Palgrave Macmillan.
Miller, R., The Top 100 Best-Performing Companies In The World, 2019. Available from: https://ceoworld.biz/2019/06/28/the-top-100-best-performing-companies-in-the-world-2019/.
Mitchell, R., What Google Knows About You. Computerworld, 2009.
Rocher, L., J.M. Hendrickx, and Y.-A. de Montjoye, Estimating the success of re-identifications in incomplete datasets using generative models. Nature Communications, 2019.
Romm, T. and D. Harwell, The Washington Post, 2019 [cited 2019 22 September]; Available from: https://www.washingtonpost.com/technology/2019/09/18/facebook-google-twitter-face-fresh-heat-congress-harmful-online-content/.
Sherman, L., Why Facebook will never change its business model. Forbes, 2018; Available from: https://www.forbes.com/sites/lensherman/2018/04/16/why-facebook-will-never-change-its-business-model/#5946b93564a7.
Smith, D., Zuckerberg put on back foot as House grills Facebook CEO over user tracking. The Guardian, 2018; Available from: https://www.theguardian.com/technology/2018/apr/11/zuckerberg-hearing-facebook-tracking-questions-house-back-foot.
Stabell, C.B. and Ø.D. Fjeldstad, Configuring value for competitive advantage: on chains, shops, and networks. Strategic Management Journal, 1998. 19(5): p. 413-437.