Living in a Connected World/Filter Bubbles and the Flow of Information

Introduction

 
A visual representation of a filter bubble.

You are now aware of your breathing

Upon reading this remark one might wish they hadn't done so, for in drawing attention to an unconscious, automatic process we are encouraged to acknowledge the purely machine-like essence of our own internal body. When we think about breathing we invariably disturb its normal rhythm; our thoughts slow the machine and make it inefficient; our lungs have gone rogue and the staggering number of muscles used in breathing or blinking are rebelling. When our attention is cast on them, they ebb, then falter. This machine isn't contingent on our active engagement and yet it is fundamentally tied to us, our habits, our environments – it is, indelibly: us!

Whereas our internal machinery was manufactured over many millions of years of evolution, to whom or to what process do we owe the internal machinery of the internet which operates as persistently as a heartbeat or the flexing of the diaphragm?

As we will see, beneath the internet lies its own mathematical language which directs our online activities in ways we might not have anticipated and, in understanding the most basic of concepts that breathing is vital to life, so too will we reveal how the internet’s underlying digital apparatus is now inexorably tied to our entire online experience.

Manufacturing Consent

"The past few years have witnessed a rapid penetration of the internet by the leading newspapers and media conglomerates, all fearful of being outflanked by small pioneer users of the new technology, and willing (and able) to accept losses for years while testing out these new waters. Anxious to reduce these losses, however, and with advertisers leery of the value of spending in a medium characterised by excessive audience control and rapid surfing, the large media entrants into the internet have gravitated to making familiar compromises – more attention to selling goods, cutting back on news, and providing features immediately attractive to audiences and advertisers."
- Manufacturing Consent [1]

In Noam Chomsky and Edward S. Herman's essential work on the ingrained modus operandi of mass communications media, we find a critical basis from which to continue. Manufacturing Consent delivers a critique that must pervade all our subsequent findings should we wish to truly scrutinise the situation at hand. Although it is beyond our reach to add to the propaganda model proposed by Chomsky and Herman, it stands nonetheless not only as a focal point for reflection but as an aide in going forward. Indeed, what Chomsky and Herman demonstrated is still as salient today, and there will be interesting contrasts and comparisons to be made.

A Propaganda Model: Five Filters

"The elements interact with and reinforce each other. The raw material of news must pass through successive filters, leaving only the cleansed residue fit to print. They fix the premise of discourse and interpretation, and the definition of what is newsworthy in the first place, and they explain the basis and operations of what amount to propaganda campaigns."
- Manufacturing Consent [2]
 
Eli Pariser, author of 'The Filter Bubble', giving a TED Talk in 2011

The "filter bubble" was a term coined by Eli Pariser in a work of 2011 [3]. Before we expand on what precisely Pariser means, let us first suggest that Pariser's filter bubble owes its primary conception to the Five Filters through which the "raw material of news must pass". In doing so, we will see how the invisible bubble that filters – and then constructs – our internet habits first emerged in the essential ingredients of a system-supportive propaganda model. The Five Filters of this model are:

1) Size [4]; (concentrated ownership, owner wealth, and profit orientation of the dominant mass media firms). The growth of a select few media giants has been astounding; in the US, wealth and control are so concentrated that six media giants hold dominion over 90% of what Americans choose to either read, watch or listen to. "Choose" is contradictory here, for whatever one 'chooses', one invariably gets only the choice decreed by the media barons above.
2) Advertising [5]; (as the primary income source of the mass media). Shining an investigative spotlight on social media platforms, we see that Facebook, in short, makes its money from advertising. How is it that advertisements have become so ubiquitous? What is an editorial piece, and what is a piece specially crafted for the purpose of selling a product, service or idea? The profit motive exists at every stage, into which we have been inducted not as conscious actors but as mere consumers.
3) Sourcing [6]; (the reliance of the media on information provided by government, business, and "experts" funded and approved by these primary sources and agents of power). How this has evolved with the now widespread popularity of technology that allows instantaneous video capture is of course remarkable, but it would be too simple to characterise this as 'progress' and ignore the material forces at play.
4) Flak [7]; (as a means of disciplining the media). 'Bad press', that is, the spread of unfavourable views of a media outlet, encourages advertisers to withdraw patronage – a severe and costly punishment. This emboldens the advertisers with a great and powerful energy: they can either "make" or "break" you depending on how supple you are to their message.
5) Ideology [8]; (what is meant here is the ideology of anti-communism, the state religion of America). Communism as the ultimate evil has always been the spectre haunting property owners, as it threatened the very root of their class positions and superior status. If it threatens property interests, call it Communism and mobilise the populace against the enemy! Today there might not be some encroaching Red Terror from the East, but how has the media-sanctioned ideology shaped our views with regards to terrorism, austerity measures and illegal immigration? Can a person even be illegal? Is what we do to them also terrorism?

At the time of writing, Al Jazeera English has released an animated short film giving an accessible, easy-to-understand account of the five filters model.

What the 'filter bubble' proposes is that what has passed through these five filters is, in turn, re-filtered by the end user in a process that suggests a horrible circularity. This is a personalised process that only guarantees to return to the user what they already 'like'; of course, what they like to begin with is a contrivance manufactured by the dominant media interests to foster our role as pliant consumers.

The Culture Industry: Give us again only what we had once before!

"The culture industry piously claims to be guided by its customers and to supply them with what they ask for. But while assiduously dismissing any thought of its own autonomy and proclaiming its victims its judges, it outdoes, in its veiled autocracy, all the excesses of autonomous art. The culture industry not so much adapts to the reactions of its customers as it counterfeits them. It drills them in their attitudes by behaving as if it were itself a customer. […] The culture industry is geared to mimetic regression, to the manipulation of repressed impulses to copy. Its method is to anticipate the spectator’s imitation of itself, so making it appear as if the agreement already exists which it intends to create. It can do so all the better because in a stabilized system it can indeed count on such agreement, having rather to reiterate it ritualistically than actually produce it. Its product is not stimulus at all, but a model for reactions to non-existent stimuli."
- Service to The Customer, Theodor Adorno [9]

What Adorno is talking about here is exactly the notion we're seeking to explore. This will become increasingly apparent when we delve into the systems which 'personalise' our experience, giving us "what we want whether we like it or not!" This rings true most evidently in the examples of Netflix and Amazon, where recommendation is translated as "because you watched..." and "people also bought...". We are the victims of a mechanism in these examples which we will subject to further analysis. These filtered products of the Culture Industry are so ubiquitous and yet (or, because thereof) their thoughtful exposition might conjure feelings of dread and anxiety. It is as if our very character were under threat, so closely tied is the relationship between a concept of the self and our consumption habits. Indeed, in a neo-liberal system that emphasises 'the market' above all else, all that is required of the individual is to consume – that is the extent of our responsibility and nothing more. Should we "pop" the matrix of the filter bubble with the sharpened tools of our analytical set, we might uncover a new route to a culture that doesn't diminish but expands the democratic potential of our technological systems. And perhaps, just as these filtering processes have been designed in, so too can they be designed out and replaced with something better suited to our needs.

"Just as the factory farming system that produces and delivers our food shapes what we eat, the dynamics of our media shape what information we consume. [...] We are predisposed to respond to a pretty narrow set of stimuli - if a piece of news is about sex, power, gossip, violence, celebrity, or humour, we are likely to read it first. This is the content that most easily makes it into the filter bubble."
- The Filter Bubble, Eli Pariser [10]

The Power of Algorithms

This ties in to the idea of filter bubbles and the flow of information through the impact algorithms have on organisations and on the consumers of said organisations. As an example, we will look at the impact that an algorithm can have on an online shopper. Every time the online shopper makes a purchase, or even so much as searches for a product on a website, this information is recorded and held against their account; every time they use the account, more and more data is added, and a database of this person's shopping pattern is built. [11]
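To make this concrete, here is a minimal sketch of such record-keeping. Everything in it is invented for illustration (`ShopperProfile`, the categories, the products) and bears no relation to any real retailer's system; it merely shows how each recorded search or purchase can accumulate into a "shopping pattern".

```python
from collections import Counter

class ShopperProfile:
    """Accumulates a customer's recorded actions into a shopping pattern."""

    def __init__(self):
        self.events = []                  # every recorded action, in order
        self.category_counts = Counter()  # running tally per product category

    def record(self, action, product, category):
        """Log a single search or purchase against the account."""
        self.events.append((action, product, category))
        self.category_counts[category] += 1

    def top_interests(self, n=3):
        """The categories this shopper engages with most often."""
        return [cat for cat, _ in self.category_counts.most_common(n)]

profile = ShopperProfile()
profile.record("search", "pram", "baby")
profile.record("purchase", "unscented lotion", "baby")
profile.record("search", "running shoes", "sport")
print(profile.top_interests())  # baby-related interest already dominates
```

Even three recorded actions are enough to rank the shopper's interests, which is precisely why every search, and not just every purchase, is worth recording.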

A real-life example of this occurred in the US. Andrew Pole, a statistician who worked on the algorithm system for the superstore franchise Target, was approached by colleagues who asked him whether he could use the algorithm system to figure out whether one of their customers was pregnant. The reason he was asked to carry out this somewhat bizarre task is that new parents always shop big in superstores like Target, which stock such a wide range of products, so figuring out when female customers were pregnant could end up being very profitable.

Pole took up the challenge and, in short, he was successful: the algorithm worked out that a customer was pregnant before some of her immediate family even knew. (For the full article, see [12].) This shows the power that an algorithm can possess when used 'correctly' – it can go as far as working out personal information about you such as pregnancy, even determining what stage of pregnancy you are at. This is a very strong tool which, if it isn't blocked, can reveal a lot about a person's character and could even be a security risk.

 
BP logo

Algorithms have a direct impact on creating a 'filter bubble' through their ability to record a user's search data. One interesting finding was a comparison between two people's searches for the organisation BP (British Petroleum). One user who searched for it found several articles and webpages discussing investment news related to BP, whilst another user found several articles about an oil spill that had occurred. So despite searching for the exact same thing, the filter bubble of each individual impacts what data actually comes up.

This can be looked at positively and negatively. As a positive, whatever you search for is tailored towards your previous searches, in theory meaning you have to put in less effort to find what you are actually looking for. However, from a negative viewpoint, does this mean that one's search is restricted to what they have done in the past? Due to being in a filter bubble, certain information will not get through, or it will take a long time to obtain. So in spite of the positive impact algorithms can have, they can be perceived as tools that restrict the user from fresh information, and they do so without the user being aware of what is actually happening.
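A toy sketch of how the two BP searches could diverge: identical results re-ordered for each user by overlap with their past queries. The scoring rule and all the example data are our own invention, far cruder than any real search engine, but the mechanism is the same in spirit.

```python
def personalised_rank(results, history):
    """Order the same results differently per user, boosting any result
    that shares words with that user's past search queries."""
    history_words = {w.strip(".,:;") for q in history for w in q.lower().split()}

    def score(result):
        words = {w.strip(".,:;") for w in result.lower().split()}
        return len(words & history_words)  # crude relevance boost

    return sorted(results, key=score, reverse=True)

results = [
    "BP investment news: dividend outlook for shareholders",
    "BP oil spill: environmental damage report",
]
investor_history = ["dividend yields", "shareholder investment tips"]
activist_history = ["oil spill cleanup", "environmental damage claims"]

print(personalised_rank(results, investor_history)[0])  # investment story first
print(personalised_rank(results, activist_history)[0])  # spill story first
```

The query is identical in both cases; only the stored history differs, and that alone decides which story each user sees first.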

From the Five Filters to Five Data Points

"The basic code at the heart of the new internet is pretty simple. The new generation of internet filters looks at the things you seem to like – the actual things you’ve done, or the things people like you like – and tries to extrapolate. They are prediction engines, constantly creating and refining a theory of who you are and what you’ll do and want next. Together, these engines create a unique universe of information for each of us – what I’ve come to call the filter bubble – which fundamentally alters the way we encounter ideas and information."
- Filter Bubbles, Pariser [13]

For the purposes of illustration, and to give us a conceptual notion of this new personalised Internet that extrapolates even from our most innocuous actions a "theory of you" which, in turn, moderates our subsequent activity, let us draw a comparison with the soap bubble hypothesis of the multiverse. From the inside of the filter bubble looking out, the world comes to take on (in the words of Pariser) the qualities of the Ptolemaic universe, where everything revolves around us. The Internet itself could be described as a multiverse of these individual 'bubbles', each home to a personalised set of rules or laws of nature, which are the filtered offerings derived from our constructed data points. Our bubble in this multiverse of bubbles is full of the products filtered for our convenience: it's a "custom-tailored world". We can make it as self-lobotomising as we wish; if we want to exclude uncomfortable items on reality TV or foreign wars, they're gone! If we don't follow links on the topics happening outwith our national borders, then they are deemed irrelevant and simply filtered out. This universe can be warm and welcoming, full of bright colours and soft fabrics, where you need only occupy yourself with the things individual to you. Everything you see is a reflection of your interests and desires. You are never bored. You are never annoyed. You are never challenged.

Your filter bubble is an individual paradise: a digital North Korea where you are the Supreme Leader. [14]

"Five Data Points" - They do more surveillance than the NSA!

"[T]he top fifty Internet sites, from CNN to Yahoo to MSN, install an average of 64 data-laden cookies and personal tracking beacons each. Search for a word like 'depression' on Dictionary.com, and the site installs up to 223 tracking cookies and beacons on your computer so that other Web sites can target you with antidepressants."
- Filter Bubbles, Pariser [15]

In the above example, a search for "depression" would count as a data point, and this search would then determine what you see next. Pariser notes that the company Hunch claims it can predict your consumer preferences correctly a staggering 80% of the time with only five such data points. Moreover, even without your five data points, if two people you are connected with have disclosed five data points, Hunch can still predict your future preferences accurately. What that means is that even if you are not giving data away about yourself, an interpretation of you can be formed from the data of your friends. However, doesn't this suggest a rather crude approximation of who a person is? Could five pieces of information really be so telling? Are we really so one-dimensional in our habits and in the company we keep? And even if we are, by what right does a company trade this information about ourselves as a commodity? The predictions made on our consumer preferences might have come from the data points of friends, concerning information they'd rather choose not to share personally but over which an algorithm has relieved them of control.
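As a purely hypothetical illustration of prediction from a handful of data points (this is not Hunch's actual method, and every name, point and preference here is invented), one could imagine a crude nearest-neighbour scheme: find the known user whose disclosed points overlap most with yours, and borrow their recorded preference.

```python
# A toy catalogue of users whose data points and preferences are known.
KNOWN_USERS = {
    "anna": {"points": {"reads_haaretz", "buys_cookbooks", "uses_firefox"},
             "preference": "documentaries"},
    "ben":  {"points": {"follows_west_ham", "uses_firefox", "buys_games"},
             "preference": "action films"},
}

def predict_preference(data_points):
    """Borrow the preference of the known user with the most shared points."""
    def overlap(name):
        return len(KNOWN_USERS[name]["points"] & data_points)
    best = max(KNOWN_USERS, key=overlap)
    return KNOWN_USERS[best]["preference"]

your_points = {"reads_haaretz", "buys_cookbooks", "uses_firefox",
               "follows_west_ham", "active_evenings"}
print(predict_preference(your_points))  # resolves to anna's preference
```

Notice that nothing here needs to be true of you: a partial overlap with someone else's profile is enough to generate a confident-sounding prediction, which is precisely the crudeness questioned above.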

Hypothetically, could we imagine the cumulative effect of successively known data points and arrive at a predetermined picture based solely on this information? This could take the form of:

data point 1) You are a frequent visitor to the Twitter feed of Israeli news outlet Haaretz.
data point 2) You made a Google search for "correct pronunciation of quinoa".
data point 3) You bought "Islam for Dummies" on Amazon.
data point 4) You have a friend on Facebook who only posts about West Ham United.
data point 5) You use the Firefox web browser, being most active between the hours of 18:00 and 22:00.

With five data points, what can we deduce about this fictional 'you'? The question seems utterly futile, but it is the infinitely branching possibilities offered by this data which are crunched by the algorithms into results that might prove relevant, only incidental, or completely random. An attempt to mine backwards through the data is the inverse of the processes at play; vague connections arise from the data and not the other way around. We can postulate on the troubling issues that would surface should one seek to get the measure of another's actions by way of data collection alone. Edward Snowden brought many of these issues to light when he spoke at TED2014:

"People should be able to pick up the phone and call their family, people should be able to send text messages to their loved ones. People should be able to buy a book online, they should be able to travel by train, they should be able to buy an airline ticket without wondering how these events are going to look to an agent of the government years in the future; how they’re going to be misinterpreted and how they’re going to think what your intentions were."
- Edward Snowden

While Snowden's revelations might seem frightening in the context of intelligence agency over-reach into our private lives, let us also point out that a private company such as Acxiom alone has accumulated an average of 1,500 pieces of data on each person in its database - which includes 96% of all Americans [16]. The signals we send out while online are commodities to be bought and sold. The 'individual' experience is the marketization of 'You' now realized as a product. What better testament to the success of a market-system where consumers are likewise products - and yet still profess their own exalted individualism! What's good for the customer is not what is good for the citizen and, if we should retain a disposition for democracy and seek to improve our material conditions, we should actively embrace the role of participating, conscious citizen.

The Private Sphere and Democracy

 
Examples of various social media platforms.

The mobility of the private sphere

The private sphere has changed as a result of technology. Once residing in family homes and secluded areas, the private sphere existed as something that was somewhat static. It was very difficult to communicate with a group of people who were not in the same space as you. However, due to advancements in technology and the natural development of Web 2.0, people can now communicate and partake in discourse through online social networks. The easy accessibility of these networks means that one can tweet their opinion on a political matter during a work break or participate in online polls while watching the news. As a result, "Social network sites extend the connecting and mobility of the private sphere by providing online spaces that host offline and online networks of social relations". [17] This means that we can connect with friends, family, and acquaintances privately online through computers and mobile phones located in private domestic spaces. This new mobility certainly changes individual civic habits. However, do these online social networks within the private sphere actually enhance democracy?

The Personalisation of Newsfeeds

The most common form of online communication is through social media sites such as Facebook. As social media profiles are based on identity, users can "like" or "follow" institutions, businesses, organisations, and entertainment profiles in order to stay up to date with their own personal interests, as well as staying in touch with friends, family and acquaintances.

However, this becomes problematic when individuals only subscribe to profiles that hold the same opinions and beliefs as their own. More specifically, Facebook gives users the option to filter out the information they do not wish to see by allowing individuals to delete or mute friends or pages whose political or ethical viewpoints do not coincide with their own.

Moreover, the platform itself also holds a responsibility in the filtering out of information. Much like the customised search results of Google, Facebook filters out information which it believes does not suit an individual's personal beliefs, based on what they have liked and what their friends have liked.

These filter bubbles become incredibly problematic when we consider that 51% [18] of people with online access use social media as a news source, with 44% [19] using Facebook as their top social network for news. A further 61% of millennials use Facebook as their primary source for news about politics and government. [20] Therefore, it is clear that Facebook is not just a space where we connect with friends, but a space in which many obtain their news. As a result, through the creation of filter bubbles both for us and by us, we create personalised newsfeeds which show us the news that we want to see, rather than the news that we should be seeing. In turn, this may make us blind to the issues and struggles that face other communities that we are not a part of.

Problems with personalised newsfeeds

  • It segregates: Even though it is available to everyone, some argue that Facebook contains a built-in demographic bias, as you need internet access and literacy skills in order to use the site. Moreover, its initial interface was structured around privileged educational institutions, hence the site carries an American bourgeois element [21]. Furthermore, "The networked nexus of the private sphere is vulnerable to the inequalities present in the public sphere" [22]. Online social networks make the private sphere more mobile; however, they also replicate the pre-existing elites, inequalities and divides that exist within the public sphere. By separating communities within the private sphere, echo chambers are created whereby mutual opinions are shared and not debated. This leaves the question: how will we progress if, as a society, we continue to be segregated into different social divides? If social and political issues, especially those affecting minorities, are only fought by those whom they affect, then progress and solutions will prove harder to obtain.
  • The self becomes the reference point: Social networks help autonomy, control, and expression. However, "social networks of the private sphere present ego-centred needs and reflect practices structured around the self: This would present liberating practices for the user, but not necessarily democratising practices for the greater society"[23]. By only subscribing to information we wish to see, we become focused on our own self-interests; our civic habits shift from a focus on the greater good to a focus on the greater good surrounding ourselves. The private sphere proves hard to be democratic when the opposing side of the discourse is consistently silenced through filter bubbles. Thus, although social networks have a social and political place within a democracy, they don't necessarily make democracy better. They are democratic, but not democratising.

Solutions to personalised newsfeeds

  • Flip feed: This is the idea of Facebook installing a link which would allow the user to see an alternative newsfeed, one free from algorithms and personalisation, thus showing the user a newsfeed which differs in opinion from their own.
  • Taking control of your online content: Instead of only "liking" the pages of news outlets that you agree with, try to create a balanced feed for yourself by "liking" multiple news outlets that offer a different opinion from your own. [24]
  • Review options: Much like sites such as Rotten Tomatoes, whereby users leave ratings on how good a film was, a review option on Facebook may act as a tool for users to vote on how trustworthy a source is, and thus may be a good way to avoid the growing spread of fake news on the site.[25]
  • Facebook accepting its role as a news source: This is quite a complicated solution. Facebook, at present, has no desire to accept its role as a news source. Rather, it continues to focus on boosting ad serving volume and engagement rates. [26] However, if it did, it could take on the responsibility of a news outlet and present a more balanced display of information to individual newsfeeds. The problem here is: do we really want Facebook to have more power over what we do and do not see?
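The "flip feed" idea can be sketched in a few lines: the same set of posts shown two ways, once ranked by a predicted-interest score and once purely chronologically, free of personalisation. The posts and scores are invented for illustration; no platform's actual ranking works this simply.

```python
posts = [
    {"time": 1, "headline": "Opposition party policy analysis", "interest": 0.2},
    {"time": 2, "headline": "Celebrity gossip roundup",          "interest": 0.9},
    {"time": 3, "headline": "Local council budget report",       "interest": 0.4},
]

def personalised_feed(posts):
    """What the algorithm serves: highest predicted interest first."""
    return sorted(posts, key=lambda p: p["interest"], reverse=True)

def flipped_feed(posts):
    """The 'flip feed': newest first, with no personalisation at all."""
    return sorted(posts, key=lambda p: p["time"], reverse=True)

print(personalised_feed(posts)[0]["headline"])  # the gossip surfaces first
print(flipped_feed(posts)[0]["headline"])       # the newest item surfaces first
```

The contrast makes the politics of the design visible: nothing is removed from the feed, yet which story a user actually sees is decided entirely by the ordering rule.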

Citizen journalism and the flow of information

Our interaction with news sources has evidently changed; reading the news is no longer a passive activity, as citizens can now choose whether or not to endorse stories. Peer-to-peer news sharing engines such as Digg and Reddit allow users to vote on news stories that they deem to be the most important – thus possibly upsetting the hierarchy of news making and pluralising news agendas [27]. The flow of information has completely shifted: instead of news flowing in a one-to-many format, it now flows from many to many. There is no longer a top-to-bottom flow of information where news outlets and editors sit at the top deciding what is newsworthy. Therefore, mainstream news agendas are being contradicted by readers acting as news gatekeepers.

 
BBC News, who introduced a UGC hub in 2005

The role of citizen as gatekeeper is also a result of more and more journalism online being created by "the people formerly known as the audience" [28]. Citizen journalism has taken form in news outlets requesting and using live video footage, photographs and first-hand accounts from members of the public; for example, the BBC introduced a user-generated content (UGC) hub in 2005. However, it is the form of blogging that offers considerable democratic potential, as blogging allows media consumers the opportunity to become media producers. This new power to report allows citizens to present stories beyond mainstream media's set agendas, thus encouraging the plurality of voices and the expansion of the public agenda. Citizens also have more time to invest in investigative work due to a lack of deadline constraints, so they can produce in-depth follow-up stories which many journalists do not have time to do. Many citizen journalists also hold professional knowledge in a specific field, hence they can explore and discuss certain stories in greater depth than a professional journalist. However, many of these blogs created by citizens begin to regress into self-confessional posts that resemble diaries. This has sparked a debate around whether citizen journalism is about comment, evaluation and subjectivity rather than the objectivity traditionally presented by professional journalism organisations and reporters. However, Lasch[29] argues that narcissistic behaviour is motivated by the desire to be connected to society; the personal reference in blogs adds a human touch to their reports. It must also be noted that citizens are allowed to express their opinions, unlike most mainstream media outlets. Narcissism in blogging allows citizens to self-reflect, to analyse, and to simply "get a lot of things off their chest"[30]. Put simply, narcissism in citizen journalism aligns with the core ideas of democracy.

Furthermore, instead of citizens taking control over what is newsworthy, a few popular blogs or persons may determine news agendas on social networking sites, proving counterproductive as it replaces the traditional agenda setter with an inexperienced popular figure. Therefore, news agendas are still being set, but instead of by a qualified editor, or by the masses, they are being set by the online popular elite – which may be equal to or worse than the traditional system. Moreover, the idea of being underqualified is a major argument against citizen journalists, with many holding the belief that the public needs trained journalists who can interrogate, collect information and use contacts in a way that citizens simply cannot.

Nevertheless, when citizen journalists and professional journalists work together, the private and the public sphere overlap to make something democratising. Citizen journalism does not replace professional journalism but enhances it.

The creation of a new space

The private sphere does not so much oppose the public sphere as work somewhat collaboratively beside it. Through social networks, the private sphere can be accessed by two parties at opposite sides of the globe, completely reinventing the domain. Although the private sphere is mobile, it faces a considerable number of issues regarding the flow and balance of information. Social networks provide users with more control, autonomy and expression. However, despite these being key elements in a democratic society, they do not necessarily enhance democracy; thus, as Zizi Papacharissi argues: [31]

“Far from a recipe for democracy the private sphere is an attempt at new space and a new sociality”
—Zizi Papacharissi, A Private Sphere: Democracy in a Digital Age

Control over Flow of Information

Ramifications of Information Sharing

The information we share online has implications for us which reach further than simply our connections list on social media. If we first take the sharing of our personal information on social media, we are enabled some form of control through privacy settings and through decision-making in relation to our interactions and friends lists, which allows us to seemingly regulate which content of our own we share and with whom we share it. It is here that factors such as "impression management" and the forming of an online identity come into play, aiding us when faced with these decisions. However, the flow of our information online reaches further than we imagine, and there are implications which arise from the information we choose to share as it flows, knowingly or unknowingly, online – in the creation of filter bubbles as well as in aiding surveillance.

The Role of Information in the Creation of Filter Bubbles

Surveillance of user data is vital to the way in which Google operates, and under Google's own policy guidelines it can itself define how it processes and utilises user data. Google works to create individualised filter bubbles whether a user is signed up or not, by recording information about our online activity, specifically by cataloguing keywords. Google, by defining its privacy policies and terms of service, carries out large-scale surveillance of users in order to create filter bubbles that flood a user with relevant advertising, and thus for economic benefit. Issues of privacy over one's own personal content arise, and in 2012 the EU data protection regulation was proposed to protect the rights of users; it states that online users must consent to their data being used in order to create personalised filter bubbles. However, again, this consent need only take the form of accepting terms of service, which many do mindlessly to begin with. [32]

Thus, though this can limit the creation of filter bubbles to some extent by reducing the information that Google is able to store, it would be near impossible to carry out completely.
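The keyword cataloguing described above can be given a hypothetical sketch: logged search terms are matched against advertiser keyword sets, and the best-matching advert is served. The inventory, the matching rule, and the whole scheme are invented for illustration (real ad systems are vastly more complex), but it echoes Pariser's Dictionary.com example, where a search for "depression" triggers antidepressant advertising.

```python
# Invented advertiser inventory: each advert is tagged with target keywords.
AD_INVENTORY = {
    "budget airline":  {"flight", "holiday", "travel"},
    "antidepressants": {"depression", "anxiety", "stress"},
}

search_log = []  # the catalogue of keywords this user has searched for

def log_search(query):
    """Catalogue every keyword the user searches for."""
    search_log.extend(query.lower().split())

def select_ad():
    """Serve the advert whose keywords best match the catalogued terms."""
    def matches(ad):
        return len(AD_INVENTORY[ad] & set(search_log))
    return max(AD_INVENTORY, key=matches)

log_search("symptoms of depression")
log_search("coping with stress at work")
print(select_ad())  # the catalogued keywords alone drive the advert choice
```

Note that the user consents to none of this explicitly; merely searching is enough to populate the catalogue, which is the economic engine behind the filter bubble.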

Google as a Surveillance Machine

Google can only work as Google by surveilling user data to gather information about users’ interests and activities, which is then used to target advertising. Here, it is almost impossible for a user to maintain control over the content they share and still be able to engage with the internet. This applies both to passive users of Google and to those who decide to sign up for a Google account.[33] In signing up for a Gmail or Google+ account, we allow Google to monitor our activity, even collecting information about us from our friends. According to Google, this is an attempt to create a better, more individualised user experience on any Google-related site; nevertheless, there are issues over invasions of privacy, as well as the plain fact that the creation of filter bubbles enables targeted advertising and the commodification of users for Google’s financial benefit. In owning a Gmail account, and allowing Google to comb through everything we send and receive, we enable Google to store our information in its databases.[34]

Information Sharing on Social Media

The NSA monitors our use of Google and Facebook.[35] Facebook can be considered a threat to privacy if we take into account the wealth of information that millions of Facebook users regularly upload and update about their own lives and the lives of their friends. The use of privacy settings on such platforms can give slight control, and even more so an illusion of further control, over the information we share online. Helen Nissenbaum argues that “the right to privacy is neither a right to secrecy nor a right to control but a right to appropriate flow of personal information.”[36] Privacy settings on SNSs can allow a user, such as a Facebook user, to control the flow of the information they put online. However, these settings are often not enough, and are undermined by online surveillance carried out by corporations wishing to profit from their users. Information gathering, as carried out by Google and by any site that uses cookies to give a “better experience”, both undermines the privacy of users without their knowledge and generates profit by creating filter bubbles that allow targeted advertising.

From Privacy to Surveillance

 
Surveillance cameras

Any person willing to engage with social media, with any site powered by Google, or with any site that functions with the use of cookies, will have information about them stored in databases, possibly monitored and possibly even used for surveillance. Privacy settings can provide some protection for a user who wishes to control the flow of their information to whomever they choose. However, these do not protect users from the investigatory powers of the corporations themselves, or indeed from anyone who knows their way around such measures. Fuchs states that “In modern society, privacy is inherently linked to surveillance.”[37] Our online visibility, for many, is somewhat out of our own control, especially if we consider what our friends and institutions may share about us online. But this also goes deeper. The Investigatory Powers Bill introduced in the UK [38] now allows many organisations to view and comb through search and internet history by forcing internet service providers to keep a log of the sites visited by their users. Our information flows to companies often without us even realising, and goes on to be stored, compromising privacy. Users are also unknowingly profiled with their stored data in the creation of filter bubbles, arguably an intrusion of privacy in itself. There are ways around this, such as deleting internet history or joining services that promise anonymity, but for the vast majority of internet users their information already flows through the internet.

“In modern society, privacy is inherently linked to surveillance.”
—Christian Fuchs, Social Media: A Critical Introduction [39]

Different Data Trails lead to Different Experiences

Our experiences on the internet differ depending on who we are and on our personal interests. Through the function of filter bubbles explained previously, one person’s experience of the internet, influenced by their own agendas, can vary greatly from that of another user who uses the internet in a different way and for different things. Each person’s likes and dislikes are uniquely their own. Raymond Williams’s cultural determinism centred on new media as a device: the way it was used and the way it was controlled. Williams believed the true potential of new media could only be exploited through the social norms and ideals already present in a culture, and that traditional power relations would therefore be sustained: much of what a social group is used to being presented with will continue to be reproduced. [40] These ideas are relevant when focusing on 'data trails' and how different individuals’ data trails lead them to experience different things on the web.

Williams believed technology is political, and that people’s views are influenced by the algorithms set on different web pages and by power relations in society. [41] This is a downside to the power of "new media", and accounts for why many people are only able to view a selected portion of the mass of information available to them: people in higher power control what is presented to them, based on their interests. Facebook founder Mark Zuckerberg said Facebook’s main function was to manage impressions and develop relationships with other people the way you would in a face-to-face conversation.[42] However, critics worry about the invasion of privacy Facebook undertakes without full explanation, and hence how it moves from encouraging people to share "what’s on their mind" with their close relations to a more Web 2.0 standpoint, in which posts and likes made by users help satisfy corporations that benefit economically, in the form of advertisements, from a vast store of information about people’s interests.[43]

Worries over privacy on the internet have been ongoing since data trails became common knowledge. As the power of the internet continuously grows stronger, many fear that an individual’s control over what they post and who sees it is beginning to weaken. The so-called protective framework that runs on different online platforms to ensure some sort of privacy has been dismissed: many corporate and government executives have highlighted problems in it, suggesting that the concept of 'privacy' is non-existent these days. They suggest that the wealth of knowledge now available online in modern society, and the vast benefits users receive, come at a cost: personal information and data trails being easily accessible to different organisations. There are debates over whether privacy is a "moral or a legal right". [44] Morally it seems wrong and invasive that many organisations can access your personal data trails to use at their own disposal. Legally, however, this can seem to benefit users themselves by providing them with a more enjoyable experience online, as organisations filter in things they know the user will like.

Despite being on their own private platform, an individual is still, at the end of the day, operating on a public platform within a political and public sphere. This results in a conflict between what is public and what is private, because the content an individual produces sits on such a public domain. Interacting with people on social media and posting comments on YouTube, for example, are ways in which internet users interact publicly and thereby, perhaps without willingly knowing, expose themselves and their data trails to be looked at by different people and organisations, giving each of them a different online experience. [45]

Advertising and Monetisation: Filter Bubbles in Consumer Culture

Facebook founder Mark Zuckerberg is worth roughly $60 billion, yet the site he created over a decade ago is free for its users. Likewise, Twitter founder Jack Dorsey is worth nearly $2 billion, but it costs you nothing to send a tweet. This is possible through the marketisation and monetisation of the products these men have created: to them, every move of the mouse is an opportunity to make money.

Advertising

Advertising revenue is one of the main ways social media makes its money. Facebook reported massive revenue jumps in Q1 2016 by selling mobile ads, pushing its total revenue to $5.5bn from $3bn. Mobile ads accounted for four fifths of this growth[46], because they sell for more than desktop ads and more people see them. The Facebook app has been downloaded onto over 85% of smartphones worldwide, an estimated active mobile audience of around 1.6 billion[47]. This is not counting the entire ecosystem built into Facebook’s Messenger application, which launched in 2015. In terms of revenue, there is no better social media business model than Zuckerberg’s, whose advertising segment alone reached a value of US$13,539 million by Q3 2018, excluding revenue from other segments such as payments and fees[48].

 
Mark Zuckerberg - South by Southwest, 2008


That’s not to say its younger cousin, Dorsey’s Twitter, is a slow mover. In fact, Twitter’s Q2 earnings for 2016 hit $610 million. While this was less than expected for the company, they too seem to be embracing the shift from desktop to mobile, particularly when it comes to video sharing and playback, looking to create what they call ‘premium viewing environments’[49]. Since this statement towards the end of 2016 there has been a spike in video sharing on Twitter, usually accompanied by a pre-roll ad, especially in the case of sports or TV highlights. Businesses also see Twitter as a space to share their content, and if done correctly, the audience do most of the work for them. An example is the immensely popular trailer for the second season of Netflix’s Stranger Things, which garnered 150k retweets with little effort from the show’s Twitter account. Likewise, the famous selfie of assorted stars at the Academy Awards in 2014, a guerrilla ad for Samsung’s Galaxy Note, got almost 4 million retweets. Social media advertising matters, and companies are very aware of that.

Tech upstart Snapchat has also branched into the world of big business by incorporating advertising into its Discover tab. Snapchat has no desktop component to worry about, so all of the premium advertising space it offers comes in the form of those aforementioned, expensive mobile ads. A thirty-second ‘Brand Story’ will run a company $750,000, whereas a space on the Discover page costs closer to $250,000[50]. Ads now play between your friends’ stories, and companies pay huge sums of money to appear first on the Discover tab. One company that seems to have a permanent slot here is the Daily Mail, which appears first almost every day. The aim of these companies is to gain a subscriber base by offering some free content that leads you towards the premium content they offer. Two that do this very well are Sky Sports and ESPN, which in the vast majority of cases post highlights of football and basketball not long after the games have finished.

While there is a definite perceived shift from desktop to mobile, there is not necessarily a need to kill off desktop advertising, or even cross platform advertising. This is something companies still use to their advantage frequently, sometimes without you even knowing.

Cookies

In short, apart from the delicious biscuits, a cookie can be defined as a small text file (up to 4KB) created by a website and stored on the user's computer, either temporarily for that session only or permanently on the hard disk (a persistent cookie). Cookies provide a way for the website to recognise you and keep track of your preferences.[51] Naturally, this is big business for websites selling advertising space, as they can tailor ads directly to the consumer and, as has been mentioned in previous sections of this Wikibook, create an online profile for users. This allows the likes of Facebook and Twitter to market the space on their desktop sites especially effectively, and to create individual ‘filter bubbles’ for specific users: bubbles filled not with content, but with advertising.
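
As an illustration of the mechanism, Python's standard library ships an `http.cookies` module that mirrors how a site sets and later reads back a cookie. This is only a minimal sketch: the cookie name and value below are invented, and real tracking cookies usually store an opaque identifier rather than a readable preference.

```python
from http.cookies import SimpleCookie

# Server side: build a small persistent cookie recording a (hypothetical)
# user preference. Real trackers typically store an opaque user ID instead.
cookie = SimpleCookie()
cookie["interests"] = "sports"
cookie["interests"]["max-age"] = 60 * 60 * 24 * 365  # persist for one year
cookie["interests"]["path"] = "/"

# The value the site sends to the browser in a Set-Cookie header:
header = cookie["interests"].OutputString()
print(header)

# On a later visit the browser sends the value back, and the site parses
# it to recognise the returning user and tailor what they are shown:
returning = SimpleCookie()
returning.load("interests=sports")
print(returning["interests"].value)  # -> sports
```

Because the browser returns the cookie on every subsequent request to the same site, each visit can be linked into one continuous profile, which is exactly what makes the tailored advertising described above possible.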

Following a revision of the UK’s laws surrounding online consent, websites are now required to make it clear that they are using cookies, and that by interacting with the site you imply your consent to having them build a virtual profile of you[52]. The vast majority of websites have dealt with this cleverly, tucking the extra information on ‘cookie policy’ away behind a few more clicks. Much like the terms and conditions of almost every online form, these are easily brushed over by users who would rather access a site’s content than wade through a quagmire of complex legal jargon.

 
Cookies hidden away in Firefox 3.0

While there are arguments against cookies from the standpoints of online privacy and opportunistic advertising, from a flow-of-information point of view they are a helpful resource. For example, while expanding our filter bubble may make sense when it comes to incorporating different points of view, or new things we have not tried before, from an advertising point of view it makes almost no sense. Take Facebook: its billion or so users are not all going to be interested in the same things; in fact, there is a huge chance that their only common interest is Facebook itself. Cookie software means these people do not all see the same ads, but instead see a tailor-made set of adverts specific to their likes and dislikes. If the advertising is going to exist, and it will, then there is little point in seeing things we are likely to have no interest in.

Premium Filter Bubbles

Another way that companies have found to utilise filter bubbles is through the production of premium content for users. Often these services have a free component, for example Spotify or YouTube, but hide the best content or experience behind a paywall. News outlets have been doing this with varying degrees of success for a great number of years, fairly safe in the knowledge that the content they create is premium enough to warrant a fee, as it had done in print for hundreds of years before the internet even existed. For free sites, branching into premium content was a leap in the dark, but in some cases it has worked exceptionally well.

YouTube Red

Here we will look at YouTube’s ‘YouTube Red’ premium programme as a case study. Launched in 2015, YouTube Red is a premium service for YouTube users which exists within the free YouTube ecosystem. It allows users to watch YouTube videos ad-free and save them for offline viewing, and it creates original content in partnership with already successful YouTubers such as Casey Neistat. The platform, much like Netflix, has integrated these original series with a catalogue of films and TV shows available to view on demand. If we recall Twitter’s comments regarding premium video streaming content at the end of 2016, we can see how far ahead of the curve YouTube was, having realised how to grow its already successful platform a year or more earlier.

 
YouTube Red Logo

YouTube Red allows YouTube, or parent company Google, to take the idea of filter bubbles to an even higher level than any free site can achieve. By paying a monthly subscription, people are already giving a deeper insight into their likes and dislikes than people accessing free content. There are millions of videos on YouTube, many of which aren’t worth the time it takes to load them, but you can’t complain if you paid nothing to watch grainy mobile-phone footage of a concert or sporting event. YouTube Red puts more onus on the content creators to deliver to a paying audience. This gives a clear idea of what content users will be interested in, and therefore allows YouTube to commission more content in that mould.


Control Over What We See

Reality is a scarce resource[53]: the idea that the media uses the news to fabricate a reality for us.

Who is in Control?

To what extent are we, as users, in control of what we see online? Who regulates the content that we consume and connect with every day? It is fair to suggest that there is a great deal of control over what we view online. The internet shapes our opinions of the events, ideas and issues displayed and reported online, which can be far from objective. Media platforms pick and choose what to show the consumer. Media power can effectively control, to an extent, the minds of readers and viewers[54]. We give in to an endless stream of suggested content specific to us, until we become completely biased in our opinions and in what we use on the internet. Not just social media but other people's opinions create filter bubbles that influence our own.

Filter Bubbles

Filter bubbles constantly control what we view online. We like pages that hold biased opinions and are selective in producing content to fit that bias. That is why we like them: they fit our own opinions, and we like being reassured that someone else feels the same as we do. Filter bubbles then produce more of this biased content, fitting our ideals, and we are surrounded by it. They effectively isolate people in their own ideological reality. We become part of an opinion that expands and grows with the power we give to it. Advertisements are another way in which we are effectively distracted from mainstream content, along with Google's algorithm, which is specifically tailored to produce content for the individual.

The issue of filter bubbles has been recognised as a growing concern by internet figures such as Bill Gates and Mark Zuckerberg. In a recent interview, Gates pointed out that “Technology such as social media lets you go off with like-minded people, so you’re not mixing and sharing and understanding other points of view”. Zuckerberg, the CEO of Facebook, mentioned filter bubbles as one of the most discussed concerns of the year in his 2017 manifesto ‘Building Global Community’. In this manifesto he highlights the risk of filter bubbles causing “sensationalism and polarization leading to a loss of common understanding” and makes it a goal to create an informed, inclusive community that will help people see a “more complete picture.”

Filter bubbles are not specific to social media and the internet; they can also arise from public opinion through lobbying, protesting and activism.

'Existing research has suggested several means by which social media can influence collective action, such as providing mobilizing information and news not available in other media, facilitating the coordination of demonstrations, allowing users to join political causes, and creating opportunities to exchange opinions with other people'[55] 
'Consumption of hard news, particularly newspaper and online news, has been found to be a consistent predictor of various forms of political participation'[56]

The Trump Campaign

During the election, social media users were bombarded with video after video on Facebook and sometimes other platforms, each one highlighting a new flaw in Donald Trump’s campaign: promises made and not kept, his derogatory language, his lies, his past career, and portrayals of him as a misogynist, all of which worked to the advantage of his opponent, Hillary Clinton. The internet played a huge role in showing the election to the world and provided a platform for discussion and opinion. However, objective news reporting was cast aside, and in many instances newspapers turned away from the rule of objectivity, with writers using what Howard Portnoy, in his article, describes as “snark”[57] to report on the campaign. Especially in well-established newspapers like The New York Times, biased reporting can cause people to change their own views to suit what they believe to be the general consensus.

 
Donald Trump

Facebook and Media Outlets

Facebook played a huge role in shaping people’s opinions of Donald Trump, so much so that people overlooked the many positives and negatives of the opposing candidate, Hillary Clinton. Even though Trump was portrayed as ruthless and cruel, he was victorious in his rise to the presidency: an example of how the internet forms our opinions and is partly to blame for the outcome. This idea is especially prevalent in countries outside the US, whose only insight into the election was through the internet. People were therefore held at the will of the internet, which held all the power in controlling people’s thoughts and opinions on the election. There is no doubt that the footage shown was real; however, media outlets on platforms such as Facebook twisted and manipulated it with captions and clips cut together to send very strong messages and encourage people to agree with them.


Alternatively, many people blamed the press for Donald Trump's victory. This is a valid argument, as people were bombarded with hundreds of news stories about Trump during the election, making him better known among voters.

' No one wakes up on June 17 and randomly decides on their own that Donald Trump should be the Republican nominee for president. People’s minds change because they are hearing information that they haven’t heard before'[58]

It was this non-stop coverage that arguably boosted Trump into the lead. Even those who were not voting, including people in other countries, could not get away from the news articles. Barack Obama had used the internet extensively during his own presidential campaign, in a way no one had seen before. Donald Trump did the same during his campaign, and it obviously worked to his advantage; the internet therefore has much to account for in this sense.

Symbolic Power and News Stories

The news nowadays seems to exploit the notion of ‘power to the people’. This is often considered positive, letting society voice its opinions; however, it has been proven to cut both ways for those under scrutiny.

'Everyone in a democracy is a certified media critic'

- Michael Schudson, News and Symbolic Power[59]

This supports the idea that the news and the media have the right to give out this power to the public, and that symbolic power should not be reserved only for journalists. It could be argued, however, that the media hold much of this power themselves, in that they pick and choose their content. If individuals can use their social media for entertainment and personal use then, as Gil de Zúñiga, Jung, & Valenzuela put it: 'there is no reason to think that people who are motivated to follow public affairs will not use their accounts in, say, Facebook or Google Plus to consume hard news and public-oriented information'[60]

Meikle argues that symbolic power belongs to those who have greater symbolic capital:

‘not everyone is able to exercise this power in the same kinds of way or with the same kinds of success’[61]

So does that mean that the more powerful you are, the more power you have over what content is created? It seems widely accepted that the biggest news platforms are considered the most reliable, so it is the people in that position of power who have the greatest influence over consumers. It could be argued that a news article is not the event itself but a story of the event, told through someone else's word choice and through someone else's eyes. This highlights the idea that it is the journalists and content creators who decide what society receives and how it receives it. These stories are what people accept as truth; readers rarely question to what extent what they read or watch online is the truth as written or produced by someone else, with that person or group of people deciding how they receive it.

Outside the Screen

Of course media and the internet have control over what we see; their influence does not just affect our lives on the internet but our lives in general. They control what we see outside of our screens, how we look at things, and our opinions. Media affects everything, and we participate in it, involve ourselves in it, and surround ourselves with it. It changes the outcomes of events and shapes us as people. News is too important to be left only to journalists.[62]

Filter Bubbles And Data Mining

Anything and everything we look at online is monitored. Vast and powerful programs lurk behind the scenes, keeping track of everything we say and do on social media, shopping sites like Amazon, video hosting sites like YouTube, and even search engines. These algorithms use our past experiences to create advertisements we might be interested in based on terms searched, comments posted, links followed. Certain social sites even employ an algorithm that shows content pages similar to what we have seen in the past. This construction of patterns based on past data is called Data Mining, and it is used all over the net. Contrary to popular belief, data mining is not the extraction of data itself, but the extrapolation of strings of data in order to create patterns that relate to the person whose data is being mined. Advertising and most of the content on the rest of this page have much to do with data mining, and the creation of filter bubbles merely speeds up the extrapolation process. By only exposing oneself to certain pages and products on social media or online in general, the mining algorithms are able to feed the advertising algorithms a stream of data about what they think you’ll like to see next. “Personalization and customization of web content by the user is not a new trend, and has always presented an alluring, yet not an inherently democratizing, aspect of web use. It does, however, allow users to select and promote a personalized news feed.” [63]. Such personalization is all because of data mining and the trails of data every internet user leaves behind.
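
The extrapolation step described above can be sketched in a few lines of Python. Everything here is invented for illustration: real mining systems work over billions of events, but the principle of turning a raw data trail into a pattern that feeds advertising is the same.

```python
from collections import Counter

# A hypothetical data trail: the topic of each page one user visited,
# in the order they visited them. (Invented data for illustration only.)
data_trail = [
    "football", "politics", "football", "cooking",
    "football", "politics", "football",
]

# "Mining" the trail: the raw events are extrapolated into a pattern,
# here a simple interest profile counting how often each topic recurs.
profile = Counter(data_trail)

# The advertising side then consumes the pattern, not the raw clicks:
top_interest, count = profile.most_common(1)[0]
print(top_interest, count)  # -> football 4
```

The more pages the user visits, the sharper this profile becomes, which is why a filter bubble that narrows what someone sees also speeds up how quickly they can be profiled.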

Filter Bubbles And Data Trails

 
Google predictive searches are not just the most popular searches, but also a result of data trails.

A filter bubble is generally defined as "the space created when algorithms dictate the content a user sees." Instead of getting an uninhibited experience, the user is shown content similar to what they have seen before. What content is shown is decided by the user's data trails, which, as stated earlier on this page, are tracked by massively powerful algorithms. A very good example of how data trails affect what people see online is a story told by Eli Pariser. He asked two of his friends to Google "Egypt" and to inform him of the results. The two men took pictures of their search results and discovered that, despite searching for the same term, their results were very different. One man was shown story after story on the Arab Spring, while the other got no results about it at all. Pariser went on to say that a filter bubble "is your own personal, unique universe of information that you live in online. And what's in your filter bubble depends on who you are, and it depends on what you do. But the thing is that you don't decide what gets in. And more importantly, you don't actually see what gets edited out."[64]
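
The "Egypt" anecdote can be mimicked with a toy re-ranking function: the same list of results, reordered for each user by their stored profile. The result titles, topics and profiles below are all invented, and real search personalisation uses far richer signals than this simple topic overlap, but the effect is the same.

```python
# Candidate results for the query "Egypt" (invented for illustration),
# each tagged with the topics it covers.
results_for_egypt = [
    {"title": "Arab Spring protests latest", "topics": {"news", "politics"}},
    {"title": "Holiday resorts on the Red Sea", "topics": {"travel"}},
    {"title": "Pyramids of Giza visitor guide", "topics": {"travel", "history"}},
]

def personalise(results, profile):
    # Rank results by how many of their topics overlap the user's profile:
    # the algorithm decides the order; the user never sees what was pushed down.
    return sorted(results, key=lambda r: len(r["topics"] & profile), reverse=True)

news_reader = {"news", "politics"}       # one user's data trail, summarised
holiday_planner = {"travel", "history"}  # another user's data trail

print(personalise(results_for_egypt, news_reader)[0]["title"])
print(personalise(results_for_egypt, holiday_planner)[0]["title"])
```

Two users typing the identical query are shown different first results, and neither is told which stories were filtered out: the essence of a filter bubble.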

Data Mining And Advertising

Data-trail-tracking algorithms are used not just to adjust our Google search results, but to adjust the sorts of advertisements we see around the net every day. These algorithms can alter the advertisements we are shown even as we browse, so that just a few clicks can completely change what we see. As the Cookies section above states, a vast number of websites use very small text files to keep track of our movements online and tailor our advertising to what we prefer to look at. While the effects of data-mined advertising can be averted by deleting the cookies from a hard drive, as soon as the user goes back online they will begin leaving another data trail. This makes it very easy for algorithms to create and maintain filter bubbles through the use of data mining.

Glossary

  • Algorithm: a process or a set of rules to be followed in calculations or other problem-solving situations. These tend to be very complex formulas, so a computer system is often required to work them out.
  • Citizen journalism: News stories and information being collected, analysed, posted and written by members of the general public, most commonly by the means of the internet.
  • Civic Habits: The duties or actions that people in a country, city, town, or local area partake in to better the community.
  • Data trails: An electronic record of the transactions or activities of a particular person, organization, etc. [65]
  • Data Mining: The process by which algorithms extract knowledge and patterns from data trails and use them to create a profile of a person based on their digital footprint.
  • Echo Chamber: An echo chamber is where ideas, beliefs and opinions are consistently reinforced, repeated and "echoed" within a designated space.
  • Filter bubble: "A filter bubble is a result of a personalised search in which a website algorithm selectively guesses what information a user would like to see based on information about the user (such as location, past click behaviour and search history) and as a result, users become separated from information that disagrees with their viewpoints, effectively isolating them in their own cultural or ideological bubbles" [66]
  • Gatekeeper: A gatekeeper is someone who controls what information gets through to the public.
  • Lobbying: Attempts by individuals or private interest groups to influence government decision making.
  • Newsfeed: In terms of social media, a newsfeed is the central interface where users post and share information, photos, videos, ideas and news stories with each other.
  • New Media: Means of mass communication using digital technologies such as the Internet. [67]
  • Power Dynamics: The more powerful, the more persuasive. Those who hold power in the media especially have a great deal of power over those who consume it.
  • Protesting: Media provides a source of information and opinion, creating a platform for users to discuss and participate in. The content displayed by those with opinions creates a basis for argument.
  • The Private Sphere: The private sphere is the complement or opposite to the public sphere. The private sphere is a certain sector of societal life in which an individual enjoys a degree of authority, unhampered by interventions from governmental or other institutions. Examples of the private sphere are family and home. [68]
  • Social networks: a group of websites or applications which allow users to communicate with each other by posting information, comments, messages, images, etc.
  • "Snark": A slang term combining the words snide and remark.
  • User-generated content: Any form of content such as blogs, wikis, discussion forums, posts, chats, tweets, podcasts, digital images, video, audio files, advertisements and other forms of media that is created by users of an online system or service, often made available via social media websites.
  • Web 2.0: The second stage of development of the Internet, where there was a shift from static web pages to interactive, user-generated content accompanied by the growth of social media.

References edit

  1. Chomsky, Noam & Herman, S., Edward, Manufacturing Consent: The Political Economy of The Mass Media - with a new introduction by the authors [Pantheon Books, New York 1988] (Introduction p.xvi)
  2. Chomsky, Noam & Herman, S., Edward, Manufacturing Consent: The Political Economy of The Mass Media - with a new introduction by the authors [Pantheon Books, New York 1988] (p.2)
  3. Pariser, Eli, The Filter Bubble: What the Internet is Hiding From You [Penguin Books, New York 2011]
  4. Chomsky, Noam & Herman, S., Edward, Manufacturing Consent: The Political Economy of The Mass Media - with a new introduction by the authors [Pantheon Books, New York 1988](pp.3-14)
  5. Ibid (pp.14-18)
  6. Ibid (pp.18-26)
  7. Ibid (pp.26-31)
  8. Ibid (pp.31-35)
  9. Adorno, Theodor, Minima Moralia: Reflections From Damaged Life [Verso Books Edition 2005] (pp.200-201)
  10. Pariser, Eli, The Filter Bubble: What the Internet is Hiding From You [Penguin Books, New York 2011] (Introduction)
  11. [1]
  12. [2]
  13. Ibid
  14. This idea is borrowed from Christopher Hitchens, who memorably referred to Heaven as a "celestial North Korea [...] Who but a slave desires such a ghastly fate?". We have adapted this here as "Digital North Korea".
  15. Pariser, Eli, The Filter Bubble: What the Internet is Hiding From You [Penguin Books, New York 2011] (Introduction)
  16. Ibid
  17. Papacharissi, Z. (2013). A private sphere. In Papacharissi (Ed.), A private sphere: Democracy in a digital age (1st ed., p. 131). Hoboken, NJ: Wiley.
  18. Reuters Institute for the Study of Journalism, University of Oxford. (2016). Digital news report 2016. Retrieved from http://www.digitalnewsreport.org/
  19. Reuters Institute for the Study of Journalism, University of Oxford. (2016). Digital news report 2016. Retrieved from http://www.digitalnewsreport.org/
  20. https://www.wired.com/2016/11/filter-bubble-destroying-democracy/
  21. (boyd, 2007; Hargittai, 2007, as cited in Papacharissi, Z. (2013). A private sphere. In Papacharissi (Ed.), A private sphere: Democracy in a digital age (1st ed., p. 131). Hoboken, NJ: Wiley.)
  22. Papacharissi, Z. (2013). A private sphere. In Papacharissi (Ed.), A private sphere: Democracy in a digital age (1st ed., p. 131). Hoboken, NJ: Wiley.
  23. Papacharissi, A., Zizi. (2013). A private sphere. In Papacharissi (Ed.), A private sphere: Democracy in a digital age (1st ed., pp. 131). Hoboken, NJ: Wiley.
  24. Adee, S. (2016). How can Facebook and its users burst the ‘filter bubble’? Retrieved from https://www.newscientist.com/article/2113246-how-can-facebook-and-its-users-burst-the-filter-bubble/
  25. Adee, S. (2016). How can Facebook and its users burst the ‘filter bubble’? Retrieved from https://www.newscientist.com/article/2113246-how-can-facebook-and-its-users-burst-the-filter-bubble/
  26. El-Bermawy, M. M. (2016, November 8). Your filter bubble is destroying democracy. Wired. Retrieved from https://www.wired.com/2016/11/filter-bubble-destroying-democracy/
  27. Papacharissi, Z. (2013). A private sphere. In Papacharissi (Ed.), A private sphere: Democracy in a digital age (1st ed., p. 131). Hoboken, NJ: Wiley.
  28. Rosen, J. (2006, June 27). The people formerly known as the audience. Retrieved from http://archive.pressthink.org/2006/06/27/ppl_frmr.html
  29. Papacharissi, Z. (2013). A private sphere. In Papacharissi (Ed.), A private sphere: Democracy in a digital age (1st ed., p. 131). Hoboken, NJ: Wiley.
  30. Papacharissi, Z. (2013). A private sphere. In Papacharissi (Ed.), A private sphere: Democracy in a digital age (1st ed., p. 131). Hoboken, NJ: Wiley.
  31. Papacharissi, Z. (2013). A private sphere. In Papacharissi (Ed.), A private sphere: Democracy in a digital age (1st ed., p. 131). Hoboken, NJ: Wiley.
  32. Fuchs, Christian (2013). Social Media: a critical introduction. Los Angeles, CA: SAGE. ISBN 9781446296868.
  33. Fuchs, Christian (2013). Social Media: a critical introduction. Los Angeles,CA: SAGE. pp. 126–178. ISBN 9781446296868.
  34. https://www.youtube.com/watch?v=Va07q3HFEZQ&feature=youtu.be
  35. Fuchs, Christian (30 September 2015). "Surveillance and Critical Theory" (PDF). Media and Communication. 3 (2): 6–9. doi:10.17645/mac.v3i2.207.
  36. Fuchs, Christian (2013). Social Media: a critical introduction. Los Angeles,CA: SAGE. pp. 126–178. ISBN 9781446296868.
  37. Fuchs, Christian (2013). Social Media: a critical introduction. Los Angeles,CA: SAGE. p. 158. ISBN 9781446296868.
  38. http://www.independent.co.uk/life-style/gadgets-and-tech/news/investigatory-powers-bill-act-snoopers-charter-browsing-history-what-does-it-mean-a7436251.html
  39. Fuchs, Christian (2013). Social Media: a critical introduction. Los Angeles,CA: SAGE. ISBN 9781446296868.
  40. “New Media: Determining or Determined”
    —Jon Dovey, Seth Giddings, Martin Lister, New Media: A Critical Introduction
  41. “New Media: Determining or Determined”
    —Jon Dovey, Seth Giddings, Martin Lister, New Media: A Critical Introduction
  42. “Networks Without a Cause”
    —Geert Lovink, Networks Without a Cause: A Critique of Social Media
  43. “Networks Without a Cause”
    —Geert Lovink, Networks Without a Cause: A Critique of Social Media
  44. “Internet privacy concerns confirm the case for intervention”
    —Roger Clarke, Communications of the ACM, Volume 42, Issue 2, Feb. 1999, pp. 60-67
  45. “A Private Sphere”
    —Zizi Papacharissi, A Private Sphere: Democracy in a Digital Age
  46. https://www.wsj.com/articles/facebook-revenue-soars-on-ad-growth-1461787856
  47. https://market.us/statistics/facebook/
  48. https://market.us/statistics/facebook/
  49. http://www.cnbc.com/2016/07/26/twitter-reporting-second-quarter-2016-earnings.html
  50. http://www.cnbc.com/2016/02/18/snapchats-ad-rates-drop-as-brands-find-problems.html
  51. http://www.pcmag.com/encyclopedia/term/40334/cookie
  52. https://ico.org.uk/for-organisations/guide-to-pecr/cookies-and-similar-technologies/
  53. Carey, James (1989) Communication as Culture, New York: Routledge
  54. Van Dijk, T. A. Power and the news media, University of Amsterdam
  55. Bennett, W. L., & Segerberg, A. (2011). Digital media and the personalization of collective action: Social technology and the organization of protests against the global economic crisis. Information, Communication & Society, 14, 770-799
  56. Norris, P. (2000). A virtuous circle: Political communications in postindustrial societies. Cambridge, UK: Cambridge University Press.
  57. http://libertyunyielding.com/2017/03/06/mainstream-megaliths-losing-sense-objectivity-writing-trump/
  58. Why is Trump surging? Blame the media. - The Washington Post
  59. Schudson, Michael, News and Symbolic Power (1995, p. 3) (https://dspace.stir.ac.uk/bitstream/1893/8781/1/Interpreting%20News%20Introduction.pdf)
  60. Gil de Zúñiga, H., & Valenzuela, S. (2011). The mediating path to a stronger citizenship: Online and offline networks, weak ties, and civic engagement. Communication Research, 38, 397-421
  61. Meikle, G. (2008). Naming and shaming: News satire and symbolic power. Volume 18, Numbers 2, 3 & 4.
  62. Meikle, G. (2008). Naming and shaming: News satire and symbolic power. Volume 18, Numbers 2, 3 & 4.
  63. Papacharissi, Z. (2013). A private sphere. In Papacharissi (Ed.), A private sphere: Democracy in a digital age (1st ed., p. 152). Hoboken, NJ: Wiley.
  64. Pariser, Eli. Beware online "filter bubbles". TED Talk, 2011. Accessed at https://www.ted.com/talks/eli_pariser_beware_online_filter_bubbles/transcript?language=en
  65. https://en.oxforddictionaries.com/definition/data_trail
  66. Filter Bubble
  67. https://www.google.co.uk/webhp?sourceid=chrome-instant&ion=1&espv=2&ie=UTF-8#safe=active&q=new+media+meaning&*
  68. Private sphere