Professionalism/Data Ownership
In Correspondence with STS 4600 at the University of Virginia.
Introduction
Personal data refers to information that relates to an identified or identifiable individual. Identifiers range from a user account or personal details such as a name and address to technical markers such as an IP address or cookie identifier.[1] While almost any kind of data can be collected on a user, data collection is often protected and limited. Louise Matsakis, a technology editor from Wired, writes that “health records, social security numbers, and banking details make up the most sensitive information stored online. Social media posts, location data, and search-engine queries may also be revealing but are also typically monetized in a way that, say, your credit card number is not.”[2]
"Data ownership refers to both the possession of and responsibility for information. Ownership implies power as well as control. The control of information includes not just the ability to access, create, modify, package, derive benefit from, sell or remove data, but also the right to assign these access privileges to others."[3]
Data is quickly becoming one of the most valuable resources on Earth.[4] Because data is sourced by people, this introduces a host of ethical dilemmas regarding the ownership of personal data.
People often unwittingly agree to terms that consent to the collection and sale of their personal data. These terms are generally presented in the form of fine-print terms and conditions statements or unassuming notifications from mobile apps asking to access information from your device. Once a data broker obtains a person's data, they generally do not allow the person any control over it; this raises ethical questions, especially if this data is sensitive and in cases where this data is breached.[5]
Legislation
In the United States, there is no general consumer privacy law at the federal level. Industry-focused laws exist, such as the Health Insurance Portability and Accountability Act (HIPAA) and the finance-related Gramm-Leach-Bliley Act (GLBA), but consumer data collected on the internet is largely left unregulated.[6] California and Virginia are the only US states with legislation requiring data brokers to give people the option to opt out of the sale of their data.[7]
California
In California, a bill called the California Consumer Privacy Act (CCPA) gives consumers rights over their collected data. The bill was signed into law on June 28, 2018, and gives Californians the rights to know what personal data is being collected and to whom it is being sold or disclosed. It also gives them the rights to delete their data and opt-out of the sale of their personal information, and provides protection from discrimination for exercising these rights.[8]
Virginia
Virginia enacted the Consumer Data Protection Act (CDPA) on March 2, 2021, becoming the second state to enact data privacy legislation. Similar to the CCPA, it allows consumers to access their data, delete their data, and determine who possesses their data. It also gives consumers the right to opt out of the processing and sale of their data. The CDPA further requires companies to implement "reasonable" data security practices to protect consumer data, and limits the collection and use of this data only to what is "reasonably necessary." However, it differs from the CCPA in that it also allows consumers to correct errors in their personal data.[9]
Other States
Several other states have proposed similar data privacy bills, including New York, Maryland, Massachusetts, Hawaii, and North Dakota. These bills grant rights and protections similar to the CCPA and CDPA; however, they have yet to be passed.
European Union
The EU has been improving privacy policy since 1995, when it implemented the Data Protection Directive.[10] More recently, it enacted the General Data Protection Regulation (GDPR) in May 2018 to implement protection and privacy with regard to personal data.[11] It addresses transparency, constrains data usage and collection to only what is necessary, calls for reasonable security measures, and ensures that personal data can be corrected for accuracy, among numerous other principles. It also includes a limited version of the right to be forgotten, known as the right of erasure; this right allows individuals to request that their data be removed from search engines and other databases. However, this right can only be exercised if one of several possible conditions is met.[12] Two important court cases surround this right.
Court Cases
In 2014, the Google Spain case set the precedent for the right to be forgotten in the EU. The case arose when a Spanish man complained about an old newspaper notice advertising the auction of his repossessed property. The man felt that because his debts had since been resolved, it was unfair that this search result still appeared.[13] The EU court ruled that Google had to remove the link from its search results. Some claim that deleting information in this way is a form of censorship. A similar court case was heard between Google and a French privacy regulator in 2019 concerning the jurisdiction of the right to be forgotten. The EU court ruled that search results do not have to be deleted outside of the EU. Google implements this using geoblocking, which restricts access to search results based on the location from which the search was performed. For instance, while 'forgotten' search results are hidden within the EU, the same results still appear in the United States, since the U.S. does not legally grant the right to be forgotten.[14] A sketch of this filtering logic follows.
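Google has not published the internals of its geoblocking tool, so the following is only a hypothetical sketch of the behavior the 2019 ruling requires: delisted links are hidden from EU-based searches but remain visible elsewhere. The country list, URL set, and function below are illustrative assumptions, not Google's actual code.

```python
# A minimal sketch of location-based result filtering ("geoblocking"),
# assuming a hypothetical search backend. 'Forgotten' links are hidden
# only when the request originates inside the EU.

EU_COUNTRIES = {"FR", "DE", "ES", "IT", "NL"}  # abbreviated, illustrative list

# Hypothetical URLs delisted under the right to be forgotten.
FORGOTTEN_URLS = {"https://example.com/1998-repossession-notice"}

def search_results(all_results: list, request_country: str) -> list:
    """Return results, hiding delisted URLs for EU-based requests."""
    if request_country in EU_COUNTRIES:
        return [url for url in all_results if url not in FORGOTTEN_URLS]
    return all_results  # outside the EU, the delisted link still appears

results = ["https://example.com/1998-repossession-notice",
           "https://example.com/unrelated-page"]
print(search_results(results, "ES"))  # delisted URL is hidden
print(search_results(results, "US"))  # delisted URL still appears
```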
Country Comparison
There is no single approach to data ownership and data privacy. Though the U.S. does not have a nationwide consumer privacy law analogous to the European Union's GDPR, individual states have started to take action, as seen in California and Virginia.
Some see this as a step in the right direction, but issues remain. For example, in California, one can opt out of data sale but not data collection. Thus, companies can still gather and leverage personal data unless a specific request for deletion is submitted. Furthermore, since only two states have enacted data privacy laws, personal data is left largely unregulated in the United States. While some companies have decided to roll out nationwide changes in response to the laws in California and Virginia, others have not.[15]
China has a privacy law similar to the EU's GDPR, called the Personal Information Security Specification, enacted in March 2018.[16] However, there is a tradeoff between privacy and surveillance: it is difficult to maintain government access to citizens' information while also protecting citizens from data usage by other parties.[17]
Handling issues of data ownership is not straightforward as there are many uncertainties and tradeoffs.
Data as a Commodity
General
In today's digital age, data has irreplaceable value. The largest consumers of user data are Google and Facebook, followed by Amazon, Apple, and Microsoft.[18] These big tech companies use data for obvious purposes such as tailored advertisements and understanding consumer behavior, but the primary application is providing input to artificial intelligence algorithms.[19] When data is such a valuable commodity, the question arises of whether data producers should be paid for their contribution.
In tech, what users give away for free is transformed into a precious commodity that powers today's most profitable companies. But the consumers from whom data is extracted often know little about the extent to which their information is collected, who looks at it, and how much it is worth. In exchange for free use of products such as Google, Facebook, and YouTube, users are paying with their data.
Examples of How Data Became so Valuable
Popular music and video streaming platforms such as YouTube, Netflix, and Spotify rely on the data they collect about a user's interests on their platform. Netflix states, "our business is a subscription service model that offers personalized recommendations, to help you find shows and movies of interest to you. To do this we have created a proprietary, complex recommendations system." The collected data can be surprisingly extensive. In addition to what you have watched, Netflix personalizes recommendations by looking at the time of day you are watching, the devices you are using, the duration of your viewing, how other Netflix members with similar tastes use Netflix, and information about the titles themselves, such as genre, director, actors, and release year.[20] This is a common trend among streaming platforms: collect as much data as possible about users' habits to keep them as customers for longer. The toy sketch below illustrates how such signals might combine into a single recommendation score.
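Netflix does not disclose its recommendation model, so the sketch below is only a toy illustration of the general idea its help page describes: many behavioral signals combine into one personalized score. Every feature, weight, and threshold here is a hypothetical assumption chosen for illustration.

```python
# A toy sketch of multi-signal recommendation scoring, loosely modeled on
# the signals Netflix describes (watch history, time of day, device, and
# similar viewers). All weights and features are illustrative assumptions.

def score_title(user: dict, title: dict) -> float:
    score = 0.0
    # Genre overlap with the user's watch history.
    score += 2.0 * len(set(user["watched_genres"]) & set(title["genres"]))
    # Similar-user signal: fraction of 'taste neighbors' who finished it.
    score += 3.0 * title["neighbor_completion_rate"]
    # Context signal: boost short titles for late-night mobile viewing.
    if user["hour_of_day"] >= 22 and user["device"] == "mobile":
        score += 1.0 if title["runtime_minutes"] <= 45 else -0.5
    return score

user = {"watched_genres": ["thriller", "drama"],
        "hour_of_day": 23, "device": "mobile"}
titles = [
    {"name": "Short Thriller", "genres": ["thriller"],
     "neighbor_completion_rate": 0.8, "runtime_minutes": 40},
    {"name": "Long Documentary", "genres": ["documentary"],
     "neighbor_completion_rate": 0.6, "runtime_minutes": 120},
]
# Rank titles by personalized score, highest first.
for t in sorted(titles, key=lambda t: score_title(user, t), reverse=True):
    print(t["name"], round(score_title(user, t), 2))
```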
Another common trend is the subsequent sale of user data to third parties, done with user consent; however, users are often not fully informed about what they are truly consenting to. 23andMe is a genetics startup from San Francisco. As of 2018, it had 5 million customers who sent in samples of their spit to be analyzed for genetic changes at 700,000 different locations in their genomes. 23andMe does as promised and delivers its analyses and findings about your DNA. However, in order to use its services, users have to give consent for 23andMe's use of their data in medical research, which is not a part of the product that the user will ever see. "Giving consent by checking the appropriate box below means that you agree to let 23andMe researchers use your Genetic & Self-Reported Information for 23andMe Research."[21] In 2018, the London-based drug giant GlaxoSmithKline partnered with 23andMe to develop new medicines; part of the deal entailed GlaxoSmithKline making a $300 million investment in 23andMe.[22]
These two examples illustrate how your personal data is the true commodity that companies are seeking. From a business ethics perspective, companies need to find more effective ways to inform customers about the types of data being collected on them. Similarly, customers need to be better educated about how valuable their personal data truly is and how it largely generates money for the companies. In both of the above examples, it is important to note that the user is not necessarily directly harmed by the data collection. However, there is a professional ethical dilemma in not being totally transparent about how personal data is used. Furthermore, while this personal data is being collected, a data breach or attack can put users' sensitive information under threat. Professional ethics perspectives on data ownership are explored later in this chapter.
Data Brokers
Data brokers are entities that collect and sell data. Since the terrorist attacks of 9/11, there has been high demand for highly accurate identification of individuals through data, and data brokers such as LexisNexis, Acxiom, and Experian have filled this demand by collecting highly personal information from millions of people.[23]
The ways in which data brokers collect data are numerous and include public records, internet scraping, and getting people to opt in to their data collection schemes through terms and conditions statements.[24] Some of the data collected could be subject to federal law, but the lack of regulation of data brokers in the US allows them to buy and sell the information anyway.[25]
Once a data broker has someone's information, it is nearly impossible for that person to regain control of their data. As mentioned previously, some states have begun passing legislation to fix this problem.
Data Privacy and Security
Data Protection
The possession of large quantities of consumer data has become increasingly valuable to those who know how to use it. For companies, big data can be processed and analyzed to inform smarter business moves, promote more efficient operations, and yield higher profits from satisfied customers.[26] However, when it falls into the wrong hands through cyberattacks, sensitive data can be used for malicious behavior. The management of data privacy protection has become increasingly complex due to the ubiquity of the information-intensive environment and the multidirectional demands of stakeholders and clients.[27] Companies dealing with the use and distribution of personal data must implement plans to ensure the implementation of, and compliance with, data privacy policies, standards, guidelines, and processes.[28] A systematic approach to data security involves understanding what kind of data the company has, tracking how company data is stored and transferred, and conducting regularly scheduled risk assessments. Some fear that the over-complexity of current data protection practices may expose vulnerabilities and weaknesses as data spreads across more platforms, both on-premise and in the cloud.[29] A minimal sketch of a data inventory supporting this approach follows.
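As a concrete illustration of that systematic approach, the sketch below keeps a simple inventory of what data exists and where it is stored, and flags sensitive assets that are overdue for a risk assessment. The sensitivity labels and the 90-day review interval are illustrative assumptions, not an industry standard.

```python
# A minimal sketch of a data inventory: classify what data the company
# holds, record where it is stored, and flag sensitive records that are
# overdue for a scheduled risk assessment. Labels and the 90-day interval
# are illustrative assumptions.

from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class DataAsset:
    name: str
    sensitivity: str        # e.g. "public", "internal", "sensitive"
    storage_location: str   # e.g. "on-premise DB", "cloud bucket"
    last_risk_review: date

REVIEW_INTERVAL = timedelta(days=90)

def overdue_reviews(inventory: list, today: date) -> list:
    """Return sensitive assets whose scheduled risk review has lapsed."""
    return [a for a in inventory
            if a.sensitivity == "sensitive"
            and today - a.last_risk_review > REVIEW_INTERVAL]

inventory = [
    DataAsset("customer_emails", "sensitive", "cloud bucket", date(2021, 1, 5)),
    DataAsset("press_releases", "public", "on-premise DB", date(2020, 6, 1)),
]
for asset in overdue_reviews(inventory, date(2021, 5, 11)):
    print(f"Overdue risk review: {asset.name} ({asset.storage_location})")
```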
Cyberattacks and Data Breaches
Data breaches occur through cyberattacks, the unauthorized access of a computer system or network, and involve the leak of confidential or sensitive company data.[30] A data breach can have expensive short-term impacts on companies in all industries, costing businesses an average of $3.86 million per breach and $148 per lost or stolen record.[31] At that per-record figure, a breach of just 25,000 records would cost roughly $3.7 million. On top of that, companies are obligated to perform forensic investigations to assess what information was stolen and where the vulnerability in the data security infrastructure lies.[32] Long-term effects include the loss of customer trust as the company's reputation diminishes from the breach.[32] Studies found that 85% of customers won't shop at businesses that have data security concerns and 69% of customers would avoid a company that suffered a data breach.
Case Studies
Data Collection Freedom for Data Brokers
The lack of legislation regulating data brokers means that there is little to dictate how they go about collecting data. Some data is free to collect, such as public webpages or public records. Other, often more personal, data is not, so data brokers have developed processes for getting users to unwittingly agree to their terms.
An example of this is X-Mode, a data broker that collects location data from people's smartphones through software embedded in mobile apps.[33] When a user downloads an app running X-Mode's software, the app prompts the user to allow it to use the device's location data without any further information. Many users grant this permission because they want to use in-app features. What is unknown to the user, however, is that their location is now being tracked and that data sold by X-Mode. This lack of transparency can be contrasted with the approach of 23andMe: where X-Mode does not inform users about how their data is used, 23andMe tells users exactly who wants to use their data, and how and why they want to use it. A purely hypothetical sketch of this opaque flow follows.
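X-Mode's SDK is proprietary, so the following is a purely hypothetical sketch of the pattern described above: the app shows a bare location prompt tied to a feature, while an embedded SDK quietly forwards the data to a broker. None of the names or functions correspond to X-Mode's real software.

```python
# A hypothetical sketch of the opaque data flow described above. None of
# these names correspond to X-Mode's actual SDK; the point is that the
# permission prompt never mentions a broker receiving the data.

def prompt_user(question: str) -> bool:
    """Stand-in for an OS permission dialog; assume the user taps 'Allow'."""
    print(f"PROMPT: {question}")
    return True

broker_database = []  # records accumulated by the hypothetical broker

def embedded_sdk_report(user_id: str, location: tuple) -> None:
    """The embedded SDK quietly forwards location data to the broker."""
    broker_database.append((user_id, location))  # the user is never told

def app_startup(user_id: str) -> None:
    # The user sees only a generic location prompt tied to an app feature.
    if prompt_user("Allow this app to access your location?"):
        location = (38.03, -78.51)       # stand-in for a real GPS reading
        print("Map feature enabled.")    # the feature the user wanted
        embedded_sdk_report(user_id, location)  # the part the user never sees

app_startup("user-123")
print(f"Records now held by the broker: {len(broker_database)}")
```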
Facebook and Cambridge Analytica Scandal
In 2014, about 270,000 users were paid to take a personality survey through an app which scraped their Facebook profiles. They consented to the collection on the understanding that it was for academic use by Cambridge University's Psychometrics Centre. Aleksandr Kogan, a professor at the university, was hired by Cambridge Analytica to create this app.[34] The app used Facebook's Open Graph platform, which at the time also gave access to all of the participants' friends' information. The acquired data included names, birthdays, likes, and location information, all of which was both sensitive and valuable.[35] In this way, approximately 87 million Facebook users' data was obtained by Cambridge Analytica, 99.7% of whom did not consent. The toy sketch below illustrates this permission fan-out.
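The mechanism that made this fan-out possible can be shown with a toy model of the Open Graph permission rules of the time. The profile data and function below are hypothetical; the sketch demonstrates only the key property that one consenting user's grant also exposed every friend's profile.

```python
# A toy sketch of the permission fan-out behind the Cambridge Analytica
# harvest. The graph and field names are hypothetical; under the platform
# rules of the time, one consenting user exposed all of their friends.

profiles = {
    "alice": {"likes": ["hiking"], "friends": ["bob", "carol"]},
    "bob":   {"likes": ["chess"],  "friends": ["alice"]},
    "carol": {"likes": ["jazz"],   "friends": ["alice"]},
}

def harvest(consenting_users: set) -> dict:
    """Collect profiles of consenting users AND all of their friends."""
    collected = {}
    for user in consenting_users:
        collected[user] = profiles[user]
        for friend in profiles[user]["friends"]:
            collected[friend] = profiles[friend]  # no consent from friend
    return collected

# Only Alice takes the survey, yet Bob and Carol are harvested too.
print(sorted(harvest({"alice"})))  # ['alice', 'bob', 'carol']
```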
Within the next year, Facebook learned that the data was being used by Cambridge Analytica to aid Ted Cruz's presidential campaign.[36] Facebook removed Dr. Kogan's app from the site and demanded that Cambridge Analytica delete the data.[37] According to Facebook, Cambridge Analytica confirmed that the data was deleted. However, in 2016 the company was hired by the Trump presidential campaign to provide tools for identifying the personalities of American voters and targeting advertisements to influence their behavior.[38]
The scandal exploded in 2018 when whistleblower Christopher Wylie exposed Cambridge Analytica's misuse of Facebook data, including the fact that the data had not actually been deleted. In the words of Mark Zuckerberg when he testified to Congress: "When we heard back from Cambridge Analytica that they had told us that they weren't using the data and deleted it, we considered it a closed case. In retrospect, that was clearly a mistake."[39] Facebook also suspended Cambridge Analytica from the site at this time.
Interestingly, Facebook claimed this was not a data breach. Facebook VP and Deputy General Counsel Paul Grewal stated in 2018: "The claim that this is a data breach is completely false... People knowingly provided their information, no systems were infiltrated, and no passwords or sensitive pieces of information were stolen or hacked."[40] Facebook routinely allows researchers to collect user data for academic purposes, as Dr. Kogan's app did. However, Kogan broke the rules when he sold the data to Cambridge Analytica, a commercial third party.
Christopher Wylie
According to Wylie, as a Cambridge Analytica employee heavily involved in the project, he witnessed the company's "corruption and moral disregard" firsthand.[41] He admitted that the project's massive scale was exciting at first: "We had done it. We had reconstructed tens of millions of Americans inside of a computer... I was proud that we had created something so powerful." However, after realizing the unethical manner in which the data was obtained and seeing how it was being applied, Wylie broke free from this state of acclimatization. He said "the office culture seemed to be clouding my judgment" and that it was easy to "lose sight of what I was actually involved in" while simply "staring at a screen." After this wake-up call, he came forward as a whistleblower.
Lessons
Christopher Wylie demonstrated the obligation of the professional to speak out against unethical practices. As an ex-employee, Wylie faced immense pressure when speaking out, but was ultimately motivated by professional responsibility and duty to bring these issues to light. Wylie's experiences demonstrate that it can be easy to fall victim to acclimatization with regard to malpractice in the workplace; however, by taking a step back and considering the impact on consumers' lives, newfound clarity can be achieved. Though the massive amount of personal data initially showed promise for advancing psychological profiling technology, Wylie eventually realized that the unethical procurement and usage of the data was too great a cost (see Data Ownership vs. Data Processing). The end does not always justify the means, in cases of data ownership and professional ethics as a whole.
For more information, see Facebook-Cambridge Analytica data scandal.
Ethical Implications and Further Work
Consent
In any study that uses participants to gather data, the participants must give informed consent. For a participant to give informed consent, they must be informed of all relevant information, including how the data will be used; they must understand that information; they must participate voluntarily; and they must have the capacity to decide whether to participate.[42] The EU's GDPR is the most comprehensive regulator of online data: whenever a company collects personal data from a citizen, it requires that person's explicit and informed consent in the form of opting in to the data collection.[43] A minimal sketch of such an opt-in gate follows.
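As a concrete illustration, the sketch below implements an opt-in gate in the spirit of the GDPR: data is collected only if the user has explicitly consented to that specific purpose. The consent record structure and field names are assumptions for illustration, not language from the regulation.

```python
# A minimal sketch of an opt-in consent gate: collection proceeds only
# with an explicit, purpose-specific grant, never by default. The record
# structure is an illustrative assumption, not the GDPR's own terms.

class ConsentError(Exception):
    pass

# Explicit grants recorded when the user opted in (never assumed).
consents = {("user42", "location_analytics"): True}

def collect(user_id: str, purpose: str, value: object, store: dict) -> None:
    """Store data only when an explicit opt-in exists for this purpose."""
    if not consents.get((user_id, purpose), False):
        raise ConsentError(f"No opt-in from {user_id} for '{purpose}'")
    store.setdefault(user_id, {})[purpose] = value

store = {}
collect("user42", "location_analytics", (48.85, 2.35), store)  # allowed
try:
    collect("user42", "ad_targeting", "profile", store)        # blocked
except ConsentError as err:
    print(err)
```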
However, the case studies presented above show that this practice is nowhere near ubiquitous. Data collection policies hidden inside terms and conditions, and apps that do not disclose the true uses of collected data, do not satisfy the requirements for informed consent. The US and the rest of the world need stronger regulation of data harvesting in order to protect the individual rights of online users.
Data Ownership vs. Data Processing
Should companies focus on using data to advance technologies or should consumer data be kept private to avoid misuse?
Access to large amounts of data has allowed us to improve pre-existing technologies and develop new technological fields like machine learning (ML). However, the availability of all this information can lead to selfish and ill-intentioned actions. While Facebook collected data from its users to improve their experience on the platform, that data was eventually misused. The Cambridge Analytica scandal, referenced above, is a good example of how data was shared without consent and then used to influence the 2016 election. However, there is no reason that data like this could not also be used for benevolent purposes.
Companies hold banks of their users' personal data. There is a professional expectation that a company use this data ethically and transparently. However, as in the Facebook example, the data was thought to have been deleted, but instead it lived on and was eventually used to manipulate political campaigns without the users' knowledge.
Individual Rights vs. The Common Good
At what point does serving the common good start limiting personal liberties? Data-driven solutions rely on utilizing as much data as possible to improve results, but what if the data is not easily accessible? Similarly, what if obtaining the data puts consumers' data at risk?
A good example of this is medical data. Medical data could produce advancements in science and medicine, but how is that balanced against an individual's right to privacy of personal information? We have seen this tension recently in the COVID-19 crisis. Contact tracing, tracking who has a disease and whom those people have been near, is a crucial tool for limiting the spread of an outbreak. The system uses your phone's Bluetooth to anonymously track whom you have been in close proximity to, as long as they also use the system, and it is built on data collection. At the start of the COVID-19 pandemic, Google and Apple worked together to add coronavirus tracing to Android and iOS, the two most important mobile operating systems.[44] A simplified sketch of the protocol's privacy design follows.
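The real Apple/Google protocol derives short-lived anonymous identifiers from rotating cryptographic keys and performs all matching on the device itself. The sketch below substitutes random tokens for the cryptography and is only meant to show why no central party needs names or locations.

```python
# A simplified sketch of decentralized Bluetooth contact tracing in the
# spirit of the Apple/Google system. Random tokens stand in for the
# protocol's cryptographically derived rolling identifiers.

import secrets

def new_rolling_id() -> str:
    """A short-lived anonymous identifier broadcast over Bluetooth."""
    return secrets.token_hex(8)

# Each phone remembers the rolling IDs it has heard nearby.
alice_heard = set()
bob_ids = [new_rolling_id() for _ in range(3)]  # Bob's IDs over time

# Alice and Bob are near each other; her phone logs his current ID.
alice_heard.add(bob_ids[1])

# Bob tests positive and uploads only his anonymous IDs (no name, no GPS).
published_positive_ids = set(bob_ids)

# Matching happens on Alice's phone, not on a central server.
if alice_heard & published_positive_ids:
    print("Exposure detected: you were near someone who tested positive.")
```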
In April 2021, it was revealed that the Android version of the exposure notification system had a privacy flaw that let other preinstalled apps potentially see sensitive data, namely whether someone had been in contact with a person who tested positive for COVID-19. Google immediately worked on rolling out a fix, but had the vulnerability never been found, users of the contact tracing system would have been left vulnerable. This contradicts the promise of contact tracing apps to keep data anonymized and secured. Serge Egelman, the CTO of AppCensus, which reported the vulnerability to Google, stated that "the lesson here is that getting privacy right is really hard, vulnerabilities will always be discovered in systems, but that it's in everyone's interest to work together to remediate these issues."[45]
Transparency vs. Accessibility
Should companies be able to sell collected user data without being transparent about its collection?
In the section above, this is discussed in relation to the company X-Mode. Companies are able to keep the cost of their services low or free by selling user data (location data in the case of X-Mode) without being clear about doing so. Should users have a choice in this? For example, should a user be able to choose to pay for an app rather than sharing location or other data?
In the previously mentioned 23andMe case, users are forced to consent to the use of their saliva samples and data in medical research. But has 23andMe considered selling its product as is, while allowing users to opt into the medical research program by choice? If so, perhaps not enough users would opt into the program, and 23andMe would lose revenue.
At the end of the day, there are good arguments for handling data ownership problems in a variety of different ways. These questions are important to consider, especially as new legislation begins to arise around data ownership and privacy.
Further Work
The subject of data ownership is broad and covers an extensive list of ethical dilemmas and relevant case studies. This chapter could also include ethical assessments of how personal data should be properly used in fields such as blockchain, artificial intelligence (AI), and machine learning (ML). Another topic to explore is the lifespan of personal data: how and when it should be properly disposed of, and whether users know how long their data will be used.
References
- ↑ Information Commissioner's Office (ICO). (2021, Jan. 1). What Is Personal Data? Retrieved May 11, 2021, from https://ico.org.uk/for-organisations/guide-to-data-protection/guide-to-the-general-data-protection-regulation-gdpr/key-definitions/what-is-personal-data/.
- ↑ Matsakis, L. (2019, Feb. 15). The WIRED Guide to Your Personal Data (and Who Is Using It). Wired. May 11, 2021, https://www.wired.com/story/wired-guide-personal-data-collection/.
- ↑ Office of Research Integrity, U.S. Department of Health and Human Services. (n.d.). Retrieved April 29, 2020, from https://ori.hhs.gov/education/products/n_illinois_u/datamanagement/dotopic.html
- ↑ The world's most valuable resource is no longer oil, but data. (n.d.). Retrieved April 29, 2020, from https://www.economist.com/leaders/2017/05/06/the-worlds-most-valuable-resource-is-no-longer-oil-but-data
- ↑ Pasternack, A. (2019, May 28). Here are the data brokers quietly buying and selling your personal information. Retrieved April 29, 2020, from https://www.fastcompany.com/90310803/here-are-the-data-brokers-quietly-buying-and-selling-your-personal-information
- ↑ Green, A. (2021, April 02). Complete Guide to Privacy Laws in the US. Retrieved May 10, 2021, from https://www.varonis.com/blog/us-privacy-laws/
- ↑ Whittaker, Z. (2020, January 02). Here's where Californians can stop companies selling their data. Retrieved April 29, 2020, from https://techcrunch.com/2020/01/02/california-privacy-opt-out-data/
- ↑ AB-375 Privacy: personal information: businesses. (2018, June 29). Retrieved April 29, 2020, from http://leginfo.legislature.ca.gov/faces/billTextClient.xhtml?bill_id=201720180AB375
- ↑ Rippy, S. (2021, March 03). Virginia passes the Consumer Data Protection Act. Retrieved May 10, 2021, from https://iapp.org/news/a/virginia-passes-the-consumer-data-protection-act/
- ↑ The History of the General Data Protection Regulation. (2017, March 29). Retrieved April 29, 2020, from https://edps.europa.eu/data-protection/data-protection/legislation/history-general-data-protection-regulation_en
- ↑ General Data Protection Regulation (GDPR) Compliance Guidelines. (2020). Retrieved April 29, 2020, from https://gdpr.eu/
- ↑ Intersoft Consulting. (n.d.). Right to erasure ('right to be forgotten'). Retrieved May 11, 2021, from https://gdpr-info.eu/art-17-gdpr/
- ↑ Cellan-Jones, R. (2014, May 13). EU court backs 'right to be forgotten' in Google case. Retrieved May 3, 2020, from https://www.bbc.com/news/world-europe-27388289
- ↑ Kelion, L. (2019, September 24). Google wins landmark right to be forgotten case. Retrieved May 3, 2020, from https://www.bbc.com/news/technology-49808208
- ↑ Morrison, S. (2019, December 30). California's new privacy law, explained. Retrieved April 29, 2020, from https://www.vox.com/recode/2019/12/30/21030754/ccpa-2020-california-privacy-law-rights-explained
- ↑ Sheng, W. (2020, March 16). One year after GDPR, China strengthens personal data regulations, welcoming dedicated law · TechNode. Retrieved April 29, 2020, from https://technode.com/2019/06/19/china-data-protections-law/
- ↑ Feng, E. (2020, January 5). In China, A New Call To Protect Data Privacy. Retrieved May 3, 2020, from https://www.npr.org/2020/01/05/793014617/in-china-a-new-call-to-protect-data-privacy
- ↑ Porter, Eduardo (2018). Your Data Is Crucial to a Robotic Age. Shouldn't You Be Paid for It? The New York Times. Retrieved 30 Apr 2020.
- ↑ Ayala, Manuel (2018). Should tech companies pay us for our data? World Economic Forum. Retrieved 30 Apr 2020.
- ↑ Netflix. (n.d.). How Netflix's Recommendations System Works. Retrieved May 11, 2021, from https://help.netflix.com/en/node/100639.
- ↑ 23andMe. (n.d.). Research Consent Document. 23andMe. May 11, 2021, https://www.23andme.com/about/consent/#:~:text=Giving%20consent%20by%20checking%20the,23andMe%20Research%2C%20as%20described%20above.&text=Information%22%20refers%20to%3A-,Your%20genetic%20data,the%2023andMe%20Research%20logo%20or.
- ↑ Herper, M. (2018, July 25). 23andMe Gets $300 Million Boost From GlaxoSmithKline To Develop New Drugs. Forbes. May 11, 2021, https://www.forbes.com/sites/matthewherper/2018/07/25/23andme-gets-300-million-boost-from-glaxo-to-develop-new-drugs/?sh=5bf6ba733213.
- ↑ Data Brokers: Background and Industry Overview. (2007, May 03). Retrieved April 29, 2020, from https://www.everycrsreport.com/reports/RS22137.html
- ↑ Data Brokers: Background and Industry Overview. (2007, May 03). Retrieved April 29, 2020, from https://www.everycrsreport.com/reports/RS22137.html
- ↑ Data Brokers: Background and Industry Overview. (2007, May 03). Retrieved April 29, 2020, from https://www.everycrsreport.com/reports/RS22137.html
- ↑ Moore, G. (2017, September 9). Why big data is important to your business. https://insidebigdata.com/2017/09/09/big-data-important-business/.
- ↑ Lee, W. W., Zankl, W., & Chang, H. (2016). An Ethical Approach to Data Privacy Protection. ISACA, 6(2016). https://doi.org/https://www.isaca.org/resources/isaca-journal/issues/2016/volume-6/an-ethical-approach-to-data-privacy-protection
- ↑ neatly.io. (2020, February 13). The Complexities Of Data Security & Compliance. Neatly. https://neatly.io/navigating-complexities-data-security-compliance/.
- ↑ Coggins, J. (2020, September 8). Why Complexity is the Biggest Enemy of Data Security. Lepide Blog: A Guide to IT Security, Compliance and IT Operations. https://www.lepide.com/blog/why-complexity-is-the-biggest-enemy-of-data-security/.
- ↑ Ferguson, K., Beaver, K., & Hanna, K. T. (2019, May 17). What is a Data Breach? SearchSecurity. https://searchsecurity.techtarget.com/definition/data-breach.
- ↑ The High Cost of Security Breaches. PNC Insights. (2019, June 26). https://www.pnc.com/insights/corporate-institutional/gain-market-insight/the-high-cost-of-security-breaches.html.
- ↑ a b Worldpay Editorial Team. (2019, July 10). How the Consequences of a Data Breach Threaten Small Businesses - Insights: Worldpay from FIS. FIS Global. https://www.fisglobal.com/en/insights/merchant-solutions-worldpay/article/how-the-consequences-of-a-data-breach-threaten-small-businesses.
- ↑ App Publishers: X-Mode. (2020, April 23). Retrieved May 29, 2020, from https://xmode.io/app-publishers/
- ↑ Kozlowska, Hanna (2018). The Cambridge Analytica scandal affected nearly 40 million more people than we thought. Retrieved 30 Apr 2020.
- ↑ Granville, Kevin (2018). Facebook and Cambridge Analytica: What You Need to Know as Fallout Widens. The New York Times. Retrieved 30 Apr 2020.
- ↑ Meredith, S. (2018). Facebook-Cambridge Analytica: A timeline of the data hijacking scandal. Retrieved May 11, 2021, from https://www.cnbc.com/2018/04/10/facebook-cambridge-analytica-a-timeline-of-the-data-hijacking-scandal.html
- ↑ Granville, Kevin (2018). Facebook and Cambridge Analytica: What You Need to Know as Fallout Widens. The New York Times. Retrieved 30 Apr 2020.
- ↑ Rosenburg, M., Confessore, N., & Cadwalladre, C. (2018). How Trump Consultants Exploited the Facebook Data of Millions. The New York Times. Retrieved 30 Apr 2020.
- ↑ Watson, Chloe (2018). The key moments from Mark Zuckerberg's testimony to Congress. The Guardian. Retrieved 30 Apr 2020.
- ↑ Grewal, Paul (2018). Suspending Cambridge Analytica and SCL Group From Facebook. Facebook. Retrieved 30 Apr 2020.
- ↑ Wylie, C. (2019, October 4). How I Helped Hack Democracy. Retrieved May 11, 2021, from https://nymag.com/intelligencer/2019/10/book-excerpt-mindf-ck-by-christopher-wylie.html
- ↑ Garcia, Christine (2018). Everything You Need to Know About Informed Consent. Retrieved 30 Apr 2020.
- ↑ Garcia, Christine (2018). Everything You Need to Know About Informed Consent. Retrieved 30 Apr 2020.
- ↑ O'Neill, P. H. (2020, April 28). How Apple and Google are tackling their covid privacy problem. Retrieved May 3, 2020, from https://www.technologyreview.com/2020/04/14/999472/how-apple-and-google-are-tackling-their-covid-privacy-problem/
- ↑ Wetsman, N. (2021, April 27). Android Bug Exposed COVID-19 Contact Tracing Logs to Preinstalled Apps. The Verge. May 11, 2021, https://www.theverge.com/2021/4/27/22405425/android-google-contact-tracing-bug-privacy.