Professionalism/Tristan Harris, Time Well Spent, and Smartphones

In 2013, after an internal Google presentation titled "A Call to Minimize Distraction & Respect Users' Attention"[1], Tristan Harris launched the Time Well Spent movement to raise awareness of the ways technology hijacks users' minds.[2] Five years later, many large technology corporations, including Google, released initiatives and mission statements to improve users' "digital wellbeing"[3]. These developments raise questions about smartphone usage and the ethics of persuasively designed technology.

Tristan Harris and His Pursuit of Change

Tristan Harris, a Stanford University graduate and entrepreneur, is a seasoned technology professional. He studied computer science at Stanford, interned at Apple Inc., and launched his own search startup, Apture[4], which was later acquired by Google[5], giving him a breadth of industry experience.

He has a parallel interest in psychology, from a childhood fascination with magicians to a master's program studying the psychology of behavior change[6]. His concern deepened during his years as a Design Ethicist at Google, where he "soon realized that Gmail distracts people rather than relieves them"[7] and saw himself directly building systems that harm users.

His concern extends beyond the walls of Google: he argues that the effect of social media and technology "feels like a downgrading of humans, a downgrading of humanity, …, a downgrading of democracy, a downgrading of our sense of decency."[8] At the center of his concern is the attention economy. He contends that the danger lies not in the raw power of the technology, but in the way it exploits human weaknesses and vulnerabilities for profit. Drawing on his experience in design, Harris argues that the design of these systems is at fault and that attention should turn to the companies engineering smartphone addiction.

His first-hand experiences and studies in psychology create the framework guiding his overarching question: "How do you ethically steer the thoughts and actions of two billion people's minds every day?"[7]

Harris' Call to Google Employees

While working as a product manager at Google, Harris shared a slide deck to convey his concern that current design strategies create excessive user distraction. The presentation calls on his co-workers "to create a new design ethic that aims to minimize distraction."[1] It asserts that designers at influential technology companies, like Apple, Google, and Facebook, have an "enormous responsibility" to standardize design ethics because of the vast number of users who spend time on their platforms.[1] The slide deck argues that current designs overly distract and waste users' time: certain psychological "vulnerabilities can be amplified and exploited" through the design of products to make people act against their better judgment.[1]

The slide deck also explains why Google must pioneer design ethics. Since successful products "compete by exploiting these vulnerabilities," smaller start-up companies cannot afford to focus on ethics.[1] As a result, change can only happen from the top down. It concludes that Google needs to start prioritizing its design ethics because consumers "trust [them] to make conscious decisions."[1]

As recounted in an interview with Das Magazin, Harris arrived at work the next day to an inbox full of "emails with answers" and quickly got the attention of Larry Page.[7] Despite the presentation's popularity, and despite Harris being promoted to one of Google's first design ethicist positions, little changed. As Harris stressed, "the public did not know that there was a problem".[7]

The Chemical Component of Persuasive Design

Persuasive design has been questioned for over seventy years. In a 1947 essay titled "The Engineering of Consent"[9], Edward Bernays contended that anyone with access to the "media may influence the attitudes and actions of our fellow"[9] people.

While technology and social media are not inherently addictive, dependency arises from the social environments and dopamine feedback loops they provide. Dopamine is the same chemical messenger "associated with food, exercise, love, sex, gambling, drugs"[10]. It is the feel-good chemical behind the reward-driven learning that psychologist B. F. Skinner studied in the 1930s: rewarded behaviors are repeated, which solidifies habit formation. Dopamine is generated and released through three main pathways every time a response to a stimulus results in a reward, and each rewarded response strengthens the underlying neural connections, a process known as long-term potentiation.

 
[Figure: The Dopamine Pathways]

In the context of social media, the trigger that releases dopamine could be a Facebook notification, a 'like' on Instagram, or the buzz of a text message. The human mind associates such triggers with positive social stimuli, as described by Harvard University's Science in the News[11]. Because social media and smartphones provide immediate rewards, the brain is rewired to crave notifications and 'likes'.
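A toy simulation can make this loop concrete. The sketch below is purely illustrative: the reward probability, learning rate, and "habit strength" variable are invented assumptions rather than values from any cited study, but they capture the idea that each rewarded phone check strengthens the cue-behavior association, in the spirit of long-term potentiation.

 import random
 
 # Toy model of a dopamine-driven feedback loop (illustrative only).
 # A "habit strength" value grows whenever checking the phone happens to be
 # rewarded (e.g., a new notification or 'like'), mimicking how repeated
 # rewards reinforce the association between cue and behavior.
 
 REWARD_PROBABILITY = 0.3   # hypothetical chance a check yields a reward
 LEARNING_RATE = 0.1        # hypothetical reinforcement step size
 
 def simulate_checks(num_checks: int, seed: int = 42) -> float:
     random.seed(seed)
     habit_strength = 0.0   # 0 = no habit, 1 = fully ingrained
     for _ in range(num_checks):
         rewarded = random.random() < REWARD_PROBABILITY
         if rewarded:
             # Reward: strengthen the association (a crude stand-in for
             # long-term potentiation).
             habit_strength += LEARNING_RATE * (1.0 - habit_strength)
         else:
             # No reward: the habit decays only slightly, so intermittent
             # rewards are enough to keep it growing overall.
             habit_strength -= 0.01 * habit_strength
     return habit_strength
 
 if __name__ == "__main__":
     for n in (10, 100, 1000):
         print(f"after {n} checks, habit strength ~ {simulate_checks(n):.2f}")

Even with a low reward probability, the habit strength climbs steadily in this model, which is one way to see why intermittent, unpredictable rewards are so effective at driving repeated checking.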

This dopamine-driven feedback loop is at the core of addiction. It is no secret that "many techies and marketers are tapping", often intentionally, "into decades of neuroscience research to make their products as addictive and profitable as possible"[12]. Preying on human vulnerabilities long predates the technology industry. Supermarkets, for example, are strategically designed, from the floor plan to the shelf layout, to entice shoppers into spending money. As noted by National Geographic, the dairy aisle is almost always located as far from the entrance as possible, ensuring that customers pass "a wealth of tempting products" en route to the milk[13]. Designers of all kinds of products know how to manipulate consumers: it's chemical.

Center for Humane Technology

In February 2018, Harris and a team of former tech executives launched the Center for Humane Technology (CHT), a nonprofit organization that works to "realign technology with humanity" and "reverse human downgrading." CHT claims that technology companies are creating societal problems: "shortened attention spans," "addicting children," and "turning life into a competition for likes."[14] The organization relies on donations and support from funders such as the Knight Foundation and the Ford Foundation, all of which share missions related to "social good"[15] and fostering "informed and engaged communities."[16]

Although the problems are rooted in design, CHT calls on policymakers, investors, and researchers to advocate for change, using "thought leadership, pressure, and inspiration" to promote stakeholder intervention. CHT provides resources to these stakeholders, such as a design guide to help designers take "meaningful steps towards designing a more humane product"[17] and a document to "develop & share tactics that are having positive results for students, parents and teachers."[18]

Although the call to action is directed at the makers and regulators of technology, the Center for Humane Technology also offers users suggestions for improving their relationships with their devices: deleting social media apps, turning off application notifications, and removing from the home screen applications that do not aid daily tasks. All of the organization's suggestions are intended to help users "take control."[19]

Impacts

Tech companies such as Google, Facebook, and Instagram are continually changing their platforms to help users control the time and frequency of their interactions with their devices. On August 1, 2018, Instagram and Facebook issued a press release describing "tools to help people manage their time", including "an activity dashboard, a daily reminder and a new way to limit notifications."[20] These tools let users see how much time they have spent in each app per day and can notify them once they reach a self-imposed daily limit. Additionally, Facebook claims its team has modified the News Feed to "reduce passive consumption of low-quality content— even if it decreases some of our engagement metrics in the short term."[21] Although the press release does not mention Tristan Harris or the Center for Humane Technology, the changes were "based on collaboration and inspiration from leading mental health experts," "organizations," and "academics."[20]
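None of these companies publish how their time-management tools are implemented, but the core mechanism of an activity dashboard with a daily reminder can be sketched in a few lines. The class, method names, and limit below are hypothetical, chosen only to illustrate the idea of summing per-app usage per day and comparing it with a user-chosen threshold.

 from collections import defaultdict
 from datetime import date
 
 class UsageTracker:
     """Hypothetical sketch of an activity dashboard with a daily limit."""
 
     def __init__(self, daily_limit_minutes: float):
         self.daily_limit = daily_limit_minutes
         # minutes of use keyed by (day, app)
         self.usage = defaultdict(float)
 
     def record_session(self, app: str, minutes: float, day: date) -> None:
         """Add one app session to the running daily total."""
         self.usage[(day, app)] += minutes
 
     def time_spent_today(self, app: str, day: date) -> float:
         return self.usage[(day, app)]
 
     def should_remind(self, app: str, day: date) -> bool:
         """True once today's usage reaches the user-chosen limit."""
         return self.time_spent_today(app, day) >= self.daily_limit
 
 # Example: a user sets a 45-minute limit and logs two sessions.
 tracker = UsageTracker(daily_limit_minutes=45)
 today = date(2018, 8, 1)
 tracker.record_session("instagram", 30, today)
 tracker.record_session("instagram", 20, today)
 if tracker.should_remind("instagram", today):
     print("You've reached your daily limit for Instagram.")

The real products presumably also handle multiple devices, grace periods, and notification delivery, but the underlying bookkeeping is this simple: sum sessions per day and compare against a threshold the user controls.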

Like Facebook and Instagram's initiative, Google's Digital Wellbeing initiative aligns with the Center for Humane Technology's agenda without mentioning influence from Tristan Harris. Google claims "technology should improve life, not distract from it."[22] The team supports this claim by working to "facilitate disconnection," "reduce temptation to re-engage," and "allow for partial disconnection"[23] across devices and platforms including Android, YouTube, Google Assistant, and Gmail. A dashboard for Android phones provides a "daily view of the time spent on your phone, how frequently you use different apps, and how many notifications you get."[24] YouTube launched an optional feature that "will pause your video" and remind users to "take a break." Gmail allows users to "turn on high-priority notifications to limit the number of email alerts" received.[25]
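The Gmail option amounts to filtering which events generate alerts. A minimal sketch follows, assuming a hypothetical Notification record with a priority flag; the field name, threshold, and data are illustrative and are not Gmail's actual API.

 from dataclasses import dataclass
 
 @dataclass
 class Notification:
     sender: str
     subject: str
     high_priority: bool  # hypothetical flag, e.g. set by an importance classifier
 
 def alerts_to_show(notifications: list[Notification],
                    high_priority_only: bool) -> list[Notification]:
     """Return only the notifications that should trigger an alert."""
     if not high_priority_only:
         return notifications
     return [n for n in notifications if n.high_priority]
 
 inbox = [
     Notification("boss@example.com", "Quarterly review", high_priority=True),
     Notification("newsletter@example.com", "Weekly digest", high_priority=False),
 ]
 # With the option enabled, only one alert is shown instead of two.
 print(len(alerts_to_show(inbox, high_priority_only=True)))  # -> 1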

Apple's 'Screen Time' lets users "know how much time" they "spend on apps, websites, and more,” while App Limits allows users to “set daily limits for app categories."[26]

Conclusion

Keep the customer satisfied. Keep the consumer dissatisfied. The real customers of companies like Google and Facebook are advertisers, while users are treated as consumers. Customers pay for the goods and services they want, but consumers are coaxed into using a product more than necessary. Persuasive design takes advantage of human psychology: additional time on the platforms is a cost to consumers yet valuable to advertisers.

Designers do not have an established ethical code to follow and instead rely on their own judgment. The boundary between designers' responsibility and users' self-control remains unsettled. Because no comprehensive ethical guideline exists, professionals like Tristan Harris must create their own frameworks; an internal ethical framework allows a professional to establish, and act on, their beliefs about right and wrong.

Harris demonstrates the importance of speaking up when moral boundaries are crossed, regardless of the prevailing ethics. By sharing his belief that exploiting psychological vulnerabilities to increase user screen time is unethical, he risked losing his job. Companies like Google, Facebook, and Apple have since begun to focus more on ethical design as a result of Harris' actions.

Design ethics and best practices for user-facing technology are still developing. It is unclear whether companies are committed to ethical design for the long term or solely for public relations. Future work is needed to understand the impacts of the Time Well Spent movement and the motives behind these recent shifts toward ethical design.

References

  1. http://minimizedistraction.com/
  2. "The Problem", Center for Humane Technology. http://humanetech.com/problem
  3. "Great technology should improve life, not distract from it", Digital Wellbeing. https://wellbeing.google/
  4. https://www.apture.com/
  5. https://www.google.org/
  6. http://www.tristanharris.com/
  7. https://mobile2.12app.ch/articles/26555236
  8. https://www.wired.com/story/tristan-harris-tech-is-downgrading-humans-time-to-fight-back/
  9. https://web.archive.org/web/20120813014102/http://gromitinc.com/lego/Library/Engineering_of_consent.pdf
  10. https://now.northropgrumman.com/this-is-your-brain-on-instagram-effects-of-social-media-on-the-brain/
  11. http://sitn.hms.harvard.edu/flash/2018/dopamine-smartphones-battle-time/
  12. https://www.npr.org/sections/alltechconsidered/2013/07/24/204621796/ONLINE-REWARDS
  13. https://www.nationalgeographic.com/people-and-culture/food/the-plate/2015/06/15/surviving-the-sneaky-psychology-of-supermarkets/
  14. https://humanetech.com/
  15. https://www.fordfoundation.org/
  16. https://knightfoundation.org/
  17. https://humanetech.com/designguide/
  18. http://humanetech.com/wp-content/uploads/2019/04/Education-in-the-Age-of-Distraction.pdf
  19. https://humanetech.com/resources/take-control/
  20. https://instagram-press.com/blog/2018/08/01/new-tools-to-manage-your-time-on-instagram-and-facebook/
  21. https://newsroom.fb.com/news/2017/12/hard-questions-is-spending-time-on-social-media-bad-for-us/
  22. https://wellbeing.google/
  23. https://www.blog.google/products/android/search-jomo-new-research-digital-wellbeing/
  24. https://wellbeing.google/
  25. https://support.google.com/youtube/answer/9012523?co=GENIE.Platform%3DAndroid&hl=en
  26. https://support.apple.com/en-us/HT208982