
Issues in Interdisciplinarity 2018-19


Permission is granted to copy, distribute, and/or modify this document under the terms of the Creative Commons Attribution-ShareAlike 3.0 License.

Disciplinary Categories and Reframing Deforestation in Guinea

Guinea Regions map

This chapter explores how disciplinary categories can create knowledge borders, restricting the flow of information in problem-solving, and how hierarchy among disciplinary categories can foster the assumption that a single solution is best.

Disciplinary categories can be applied in a variety of contexts, so their precise meaning naturally varies. As a working definition for this chapter, we take disciplinary categories to be the bordered fields of academia.[1] For example, mathematics and anthropology are different disciplinary categories. The rigidity and distinctness of academic disciplines are intrinsic to the word's etymology, and these characteristics can lead to the disregard of ideas that oppose the accepted canon.[2]

As a result, there is frequently a lack of interaction between disciplines, especially in policy-making. This prevents us from reaching holistic conclusions about real-world problems.

To present these issues of disciplinary categorisation in context, we discuss a case study of environmental conservation in Guinea based on the research of Leach and Fairhead. It is a piece of interdisciplinary work exploring how an earlier absence of communication between disciplines led to essential evidence being omitted or misinterpreted, and hence to partial conclusions.

This chapter will then draw out some key issues regarding disciplinary categories and their interactions in solving issues, in relation to this case study.

Case study

Woodland in Guinea Savanna

Leach and Fairhead's research took place in Kissidougou, a city located in Guinea's savanna-forest transition zone,[3] which was believed to be undergoing a deforestation crisis.[4]

Administrators and ecologists had maintained that forest cover in Guinea had been significantly reduced by 1995.

Evidence provided predominantly by scientific disciplines formed a narrative with limited perspectives. Through extensive mathematical modelling and ecological analysis, policymakers and scientists established changes in the lifestyle and land use of the Kissidougou people, along with population growth,[5] as the primary causes of deforestation there. Their research included the identification of certain tree species and other vegetation typically found on the outskirts of forest patches, which led experts to conclude that deforestation had occurred.[3]

Leach and Fairhead challenged this by conducting research alongside existing data from several disciplines, including history, economics, archaeology and anthropology, revealing that these forest patches emerged primarily through intervention by local communities,[5] rather than as a result of deforestation.

The pair spoke to villagers about the history of Kissidougou and consulted aerial photographs documenting the area's vegetation history.[6] The data not only revealed that forest patches had actually increased, but also showed that the narrative of rapid population growth causing deforestation was unfounded. Through analysing broader regional history, anthropology, and archaeology, they found that certain areas had significantly higher rural populations in the 19th century than in 1995.[5] Viewed through an interdisciplinary lens, it becomes apparent that the change in population demographics did not result in degradation.

The researchers also included a socioeconomic analysis of Kissidougou's population to explain how its activities affected the vegetation in the area. The villagers had adapted the land to suit changing socioeconomic conditions: they switched from coffee planting to fruit-tree planting in the post-colonial period, as coffee prices fell, and this helped nurture the creation of the forest patches.[5]

It is clear from this case study that a more appropriate solution can be reached by considering evidence from multiple disciplinary categories.

A question to consider is whether research from academic disciplines that seem more objective, or that use quantitative data, is naturally more conducive to policy-making. Difficulties arise in weighing the quantitative data of some disciplines against the qualitative evidence of others. The appearance of objectivity may give such research greater leverage to 'inform policies to address [issues]' than disciplinary categories that hold other perspectives.[7] By calculating the loss of forest cover mathematically and using these figures to drive policy, we neglect the cultural values and livelihoods of local people.[7]

Range of research methodology among academic disciplines

Data is essential in solving real-world problems, but different academic disciplines are anchored to different methods of collecting it. This can lead to differing evidence, and perhaps to a different truth and a different approach to solving the same issue. Research methodologies include interviews, content analysis, focus groups and language-based analysis, to name a few.[8] This becomes a problem in interdisciplinary work when disciplines disagree about how research should be conducted.

However, as the Kissidougou case study shows, integrating viewpoints from other disciplines into discussions of how research should be conducted can highlight weaknesses in the research and factors its methodology may be neglecting.[2]

Communication between and perceived hierarchy among disciplines

An issue that should be addressed whilst incorporating multiple disciplines into solving an issue is how they will interact. Academic disciplines can be seen as communities, with ‘distinctive cultural characteristics’ and ‘cultural differences’. Tony Becher, a professor at the University of Sussex, writes that 'disciplinary groups can usefully be regarded as academic tribes, each with their own set of intellectual values and their own patch of cognitive territory'.[9]

Research integrating quantitative and qualitative methods is becoming increasingly common. One way to overcome the borders of academic disciplinary research is to employ formally designed frameworks for mixed-methods research, known as 'typologies'. However, many of these have been constructed in theoretical terms and are yet to be tested on real-world examples. These typologies draw attention to questions such as: does the qualitative or the quantitative data have priority? Is there more than one data strand? Are the types of data from each discipline collected simultaneously or sequentially?[8]

Some disciplinary categories tend to 'dominate' the input on a range of issues, according to a hierarchical structure.[10] This depends on which perspectives are reinforced, usually by authoritative bodies. Placing greater weight on viewpoints from one discipline can lead to the disregard of those derived from others. This was observed in the Kissidougou case study, where mathematical and ecological perspectives were primarily considered and supported by policymakers, while perspectives from anthropology and history were overlooked.

Conclusion

As a counterpoint to the advocacy of interdisciplinarity in solving complex issues, not all issues may benefit from working across disciplines. What matters, however, is that academic disciplines are subjected to scrutiny from alternative approaches and disciplines. Encouraging communication and opening up dialogue will always be beneficial.[2]

As the case study shows, there is a need to pull research out of disciplinary silos in order to solve complex problems more holistically.[11] By crossing the borders of academic disciplines, regardless of differences in methodology, terminology and evidence, greater validity and a more comprehensive account of an area of inquiry can be reached.[8] This can open the door to new solutions that incorporate the knowledge required to initiate productive change.[12]

External links

  • Fields of Knowledge, a zoomable map outlining different academic disciplines and their sub-sections.
  • Second Nature, a documentary based on Leach and Fairhead's research in Guinea.

References

  1. Oxford English Dictionary. “discipline, n.”. [Internet]. [cited 2018 Nov 29]. Available from:
  2. Harriss J. The case for cross-disciplinary approaches in international development. World Development. 2002;30(3):487–96.
  3. Fairhead J, Leach M. Webs of power: forest loss in Guinea. Seminar in New Delhi; 2000. p. 44–53.
  4. Shepherd J. Melissa Leach: Village Voice [Internet]. The Guardian; 2007 [cited 2018 Nov 28]. Available from:
  5. Fairhead J, Leach M. False Forest History, Complicit Social Analysis: Rethinking Some West African Environmental Narratives. World Development. 1995;23(6):1023-1035.
  6. Fairhead J, Leach M. Reading Forest History Backwards: Guinea's Forest–Savanna Mosaic, 1893–1993 [Internet]; 1995 [cited 2018 Dec 5]. Available from:
  7. Fairhead J, Leach M. Reframing Deforestation: Global Analyses and Local Realities: Studies in West Africa. 1st ed. London: Routledge; 1998.
  8. Bryman A. Integrating quantitative and qualitative research: how is it done? Qualitative Research. 2006;6(1):97-113.
  9. Becher T. The significance of disciplinary differences. Studies in Higher Education. 1994;19(2):151-161.
  10. Brew A. Disciplinary and interdisciplinary affiliations of experienced researchers. Higher Education. 2007;56(4):423-438.
  11. Stirling A. Disciplinary dilemma: working across research silos is harder than it looks. [Internet]. The Guardian; 2014 [cited 2018 Nov 26]. Available from:
  12. Jacobs JA. The Critique of Disciplinary Silos. In defense of disciplines: Interdisciplinarity and Specialization in the Research University. 1st ed. Chicago, Illinois: University of Chicago Press; 2014. p.13-26.

Disciplinary Categories and Their Effect On Gender Perception

The Androgynous Form of Shiva and Parvati in Hindu Mythology

This article analyses the categorisation of gender through various disciplines. Exhibiting the relevant issues in each case suggests that interdisciplinarity could help us arrive at more holistic inferences on such a controversial subject.

Disciplinary Categories

Disciplinary categories are the result of breaking academia down into its constituent subjects, termed disciplines, based on their content and research methods. These disciplines are then assigned to broad categories such as the humanities, social sciences, and natural sciences. The categories are devised to organise fields of knowledge, and many institutions share the same system of classification. Consequently, they appear universal and absolute. However, dissent arises not only over the labels of the categories but also over their non-mutually-exclusive content. For instance, economics can be treated as a social science or an empirical science depending on whether a qualitative or quantitative approach is used.[1] Disciplinary categories have also fluctuated over time and across geographical locations, as with the creation of gender studies, initially women's studies, as a new discipline in the 20th-century West.[2]

Similar issues of categorisation occur in human identification under the lenses of gender and sexuality.

Categorising Humans

Upon first interacting with a person, it takes only around 600 ms to recognise their sex.[3] The human brain immediately begins categorising the person based on factors such as sex, race, and age.[4] However, studies increasingly reveal the ambiguity of the boundaries between the elements of these categories.[5] French sociologist Christine Delphy observes that most work on gender presupposes that 'sex precedes gender'[6], sex being a biological function and gender a cultural identifier separating traditional masculinity and femininity. However, an incoherence arises when the issue is viewed from different disciplines, indicating that the categorisation is not as universal as it appears.

Perception of Gender in Different Disciplines

Biology

Biology frames sex differences in terms of the distribution of XX and XY chromosomes, the possession of male or female genitalia, and the balance of hormones in the body. Testosterone is associated with stereotyped masculinity because it increases competitiveness and aggression, whereas oestrogen is associated with the feminine characteristic of heightened emotion.[7] This forms the basis for cultural perceptions of males being better suited to hard labour and females to childbearing and domesticity.[8] The approach is widely criticised as too deterministic in its analysis of gender and sex categorisation. Studies have shown that children classify others by their clothes more easily than by their sexual organs,[9] demonstrating the reliance humans place on gendered archetypes when identifying others.[10] Hence, biological classification provides solid grounds for the empirical study of gender and sexuality, but the nature-versus-nurture debate renders it fallible, as it leaves little room for the individual.

Economics and Politics

Historically, capitalism has typically enforced a division between jobs deemed 'masculine' and 'feminine'. Jobs seen as useful for creating revenue and furthering society were considered superior, and were therefore assigned to men and paid,[11] whilst women were expected to perform unpaid domestic tasks.[12] In the 1950s nuclear-family model, for example, men held high-standing jobs whilst women were left as housewives or underpaid secretaries. Second-wave feminism helped reduce these differences, encouraging women to enter the workforce fully.

The economy further enforces this division daily through mass media and advertising that appeal to the artificial stereotypes associated with each gender, such as women's beauty products. Whilst this is an attempt to maximise profits by appealing to a specific audience,[13] the capitalist approach is argued to hinder gender equality by inadvertently prescribing gender stereotypes in widespread media.

LGBT rights in the EU

Law

Many countries have started implementing laws on gender that allow citizens freedom of identity regardless of their biological sex. In 2016, Norway permitted anyone to change their legal sex without surgery,[14] and in 2017 Canada added gender identity and gender expression as prohibited grounds of discrimination.[15] The UK also adopted the Gender Recognition Act. Nonetheless, most countries still do not recognise non-binary gender.

Both extremes of legal gender recognition create conflicts in society. Countries that do not recognise a third gender, gender-reassignment surgery, or name changes have been attacked from the left for their intolerance. Conversely, legalising gender changes before surgery is viewed by some as a security threat towards women, due to possible abuse of the system. This occurred in Norway in 2016, when a woman felt uncomfortable after a biological male, legally identifying as female, entered the female bathroom of a gym. The woman told the staff she felt unsafe and ended up being sued for harassment; the tribunal later ruled against the harassment claims.[16][17]

As laws tend to dictate a public paradigm of right and wrong, legal rulings on gender have dramatically helped normalise non-binary gender. Yet legal proceedings on such a divisive issue have simultaneously given rise to further conflicts over the expansion or retraction of these laws.

Linguistics

Number of Genders in Languages of the World

In the West, it is common practice to indicate an individual's identity with their preferred pronoun: he, she, they, or ze. Whether pronoun use enforces a binary of male and female is much debated. Incorrectly identifying someone's gender is construed as offensive; hence, the use of additional pronouns introduces acceptance into a language. Whilst expanding the lexical field of gender may be positive, each pronoun carries assumptions about gender norms. Just as 'he' implies masculinity, gender-neutral pronouns can inadvertently trigger associations with LGBTQ+ stereotypes.[18]

This problem is magnified in gendered languages, where gender must be expressed in many contexts, such as adjectival agreement in French or gendered first-person pronouns in Japanese.[19] Such a range of identifiers can be problematic and cause further discrimination, yet it is crucial for cultural expression. Oppressive regimes often prohibit gender descriptors that differ from traditional heterosexual male and female roles. Some people therefore only discover their identity upon encountering a word to describe it; Jang Yeong-Jin, for example, only realised his homosexuality upon leaving North Korea.[20] This demonstrates that language allows both further expression and further stereotyping of gender, inviting debate on whether such a range of identifiers is advantageous.

Conclusion

The disciplines discussed above are a small part of a greater discussion, but they suffice to illustrate the conflicting nature of gender categories. While the natural sciences focus on sex and physical features, the social sciences discuss the cultural norms attached to gender. Moreover, the two terms are often used interchangeably and are defined differently by almost every individual. However conflicting, we still need these categories: they serve our innate need to classify and better perceive the world, and they allow a third gender to be accepted and recognised.

Still, we must remember that no definition is absolute. Any broad topic must be broken down for thorough study, but it is imperative to keep in mind that the whole is greater than the sum of its parts, in academia as in gender and sexuality. One should therefore use interdisciplinarity and systems thinking, exploring every related discipline, to arrive at conclusions that better reflect reality, remembering the interconnected cause-and-effect relationships among all the elements and the consequences of oversimplifying the processes involved.

References

  1. Chetty R. Yes, Economics Is a Science. The New York Times. 2013 Oct 20. Available from:
  2. Kaplan G, Bottomely G, Rogers L. Ardent warrior for women's rights. The Sydney Morning Herald. 2003 Jul 31. Available from:
  3. Bruce V, Burton A, Hanna E, Healey P, Mason O, Coombes A et al. Sex Discrimination: How Do We Tell the Difference between Male and Female Faces?. Perception. 1993;22(2):131-152.
  4. Baudouin J, Tiberghien G. Gender is a dimension of face recognition. Journal of Experimental Psychology: Learning, Memory, and Cognition. 2002;28(2):362-365.
  5. Richards C, Bouman W, Seal L, Barker M, Nieder T, T’Sjoen G. Non-binary or genderqueer genders. International Review of Psychiatry. 2016;28(1):95-102.
  6. Delphy C. Rethinking sex and gender. Women's Studies International Forum. 1993;16(1):1-9.
  7. DeCecco J, Elia J. A Critique and Synthesis of Biological Essentialism and Social Constructionist Views of Sexuality and Gender. Journal of Homosexuality. 1993;24(3-4):1-26.
  8. Geary D. Male, female: The evolution of human sex differences. American Psychological Association; 1998.
  9. Case M. Disaggregating Gender from Sex and Sexual Orientation: The Effeminate Man in the Law and Feminist Jurisprudence. The Yale Law Journal. 1995;105(1):1-5.
  10. Kessler S, McKenna W. Gender: An Ethnomethodological Approach. University of Chicago Press; 2001.
  11. Federici S. Caliban and the Witch: Women, The Body, and Primitive Accumulation. Brooklyn, NY: Autonomedia; 2004.
  12. Women shoulder the responsibility of 'unpaid work' - Office for National Statistics [Internet]. Office for National Statistics. 2016 Nov 10. Available from:
  13. Mager J, Helgesson J. Fifty Years of Advertising Images: Some Changing Perspectives on Role Portrayals Along with Enduring Consistencies. Sex Roles. 2011;64:238–252.
  14. Norwegian law amending the legal gender. Transgender Europe. 2016. Available from:
  15. Gender Identity and Gender Expression. Department of Justice Canada. 2017. Available from:
  16. Norway: A woman is accused of harassment for questioning a man who uses the women's changing room at a fitness centre. This is the translation of the case review. [Internet]. 2018. Available from:
  17. Danielsen I, Øygarden G, Aschehoug T. Sak 68/2018 [Internet]. Diskrimineringsnemnda; 2018. Available from:
  18. Dembroff R, Wodak D. He/She/They/Ze. Ergo: An Open Access Journal of Philosophy. 2018;5(14):371-403. Available from: http://ergo.12405314.0005.014
  19. McConnell‐Ginet S. “What's in a Name?” Social Labeling and Gender Practices. The Handbook of Language and Gender. Blackwell Publishing Ltd. 2003. 69-97. Available from:
  20. Kim J, Kim S. North Korea's only openly gay defector: 'it's a weird life'. The Guardian. 2016 Feb 18. Available from:

Evidence in Nutrition

Nutrition science is a relatively young discipline.[1] There is much debate about what is healthy and about which metric best measures health. Given the social structures around weight and dieting, studying human health has as many social implications as scientific ones. In this chapter, we take an interdisciplinary approach to issues of evidence relating to nutrition, dieting, obesity and health. To understand not only the evidence itself but also the way people engage with it, we include nutrition science, marketing and economics in the discussion.

Research in Nutrition Science

Modern nutrition science began in the early 20th century with the supplementation of food with vitamins, which led to a decrease in deficiency-related diseases. In wealthy countries, the discipline then shifted its focus to the health risks posed by dietary fat and sugar. In 1977 the US Senate Committee on Nutrition and Human Needs published a somewhat controversial report, Dietary Goals for the United States, that recommended a low-fat diet. The evidence used in this report was called into question and found insufficient by the US National Academy of Sciences' Food and Nutrition Board.[1]

Methodologies

Data in nutrition science are gathered in many ways, ranging from ecological case studies to intervention experiments. The methodology used to address deficiency diseases, which involved isolating a single nutrient, was applied to research on the impacts of sugar and fat in the 1950s-70s. However, these methods, while effective for their original purpose, were not well suited to non-communicable diseases like obesity. Researchers today still use many of the same methods, but they hold their data to a much higher standard of accuracy, and recent publications are less likely to make far-reaching claims about the relative healthiness of different foods.[1]

Funding and Biases

The rise of the internet has given the general population access to a wide variety of studies, articles and arguments regarding health; however, the evidence presented in these sources may not always be legitimate. This applies even to studies that appear academic, once research sponsorship is considered. There have been multiple cases in which the outcome of a study favoured the interests of the funding body.[2] For example, the Coca-Cola Company funded research suggesting that exercise matters more to one's health than nutrition.[3]

Media outlets such as the Guardian and Inside Philanthropy have drawn attention to corporate influence in academic research. The issue has also garnered attention from academics such as Dr. Dariush Mozaffarian of Tufts University, who published a study on conflicts of interest in nutrition research, concluding that "evidence for substantial bias has been identified in conclusions of industry-sponsored systematic reviews regarding the health effects of sugar-sweetened beverages and artificial sweeteners."[4]

The presence of food-industry capital in research is enormous, and it is closely linked to the industry's presence in national health groups in countries like the USA.[5] These findings call into question the validity of evidence on health and nutrition.

Health

The official definition of health, given by the World Health Organization (WHO) in 1948, states that "health is a state of complete physical, mental and social well-being and not merely the absence of disease or infirmity."[6] The WHO does not indicate how this state is produced or measured.[7] Similarly, the OED and the NHS provide scarce explanations of health. An article in the British Journal of General Practice defines health as the capacity to adapt to an environment, subject to a variety of forces that can change and damage us.[7] There is conflict not only over how to define health but also over how to achieve it: a large body of research and evidence has led to few concrete conclusions.

Obesity and Related Diseases

There is extensive scientific research suggesting that obesity is linked to health problems, including cardiovascular disease and type II diabetes.[8][9] These findings are relatively uncontroversial and are accepted by both the scientific community and the general population. Issues of evidence become more important in defining obesity. According to the WHO, "overweight and obesity are defined as abnormal or excessive fat accumulation that may impair health."[6] Multiple metrics are used to measure obesity, including Body Mass Index (BMI) and body-fat percentage.
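To make the BMI metric concrete: it is simply weight in kilograms divided by the square of height in metres, with the WHO's adult cut-offs of 25 (overweight) and 30 (obese). The sketch below illustrates the arithmetic; the function names and the example figures are ours, chosen only for illustration.

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body Mass Index: weight in kilograms divided by height in metres squared."""
    return weight_kg / height_m ** 2

def who_category(bmi_value: float) -> str:
    """Map a BMI value onto the WHO adult classification bands."""
    if bmi_value < 18.5:
        return "underweight"
    if bmi_value < 25:
        return "normal weight"
    if bmi_value < 30:
        return "overweight"
    return "obese"

# An 85 kg person who is 1.75 m tall:
print(round(bmi(85, 1.75), 1))      # 27.8
print(who_category(bmi(85, 1.75)))  # overweight
```

The same person could nonetheless have a low or high body-fat percentage, which is why BMI alone is a contested measure of health.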

As stated in the introduction, evidence in nutrition is not just a scientific issue; it is also something we engage with socially. Society places a high value on thinness, which shapes the general perception of health. If someone appears thin or "in shape", this is often taken as evidence of good health. However, even people with normal BMIs can have what professionals call "normal-weight obesity", which is also correlated with an increased risk of cardiovascular disease and other health problems.[10]

Advertising

The Cambridge Dictionary defines advertising as "the business of trying to persuade consumers to buy given goods or services."[11] To achieve this end, advertisers must provide reasons, or evidence, in support of their product. The main issue in diet advertising is the manipulation of data and the inaccurate paraphrasing of scientific findings. So-called "soft" health claims, such as "makes you healthy", are intentionally vague, leading consumers to interpret them as valid health claims.[12] According to M. Katan, "the formulation of soft claims [is] a fine art, creating claims that imply health effects without actually naming a disease."[13] Evidence in diet advertising is important not only because of the link between diet and health, but also because advertising is a dominant tool in shaping health preferences and knowledge about food. This especially affects children and teenagers, who are the most vulnerable to external influence.[12]

Some evidence suggests that food-marketing practices may have produced positive public-health outcomes by changing the dietary habits of American consumers.[14][15][16] For instance, there has been a trend away from high-fat foods since the 1970s.[15] Additionally, a report from the American Marketing Association argues that health claims in advertising can transform markets.[17] Allowing truthful claims by manufacturers may benefit consumers, as it increases the competitive pressure on companies to market the nutritional features of foods.[15] Health claims can also be considered a legitimate educational tool.[12]

On the other hand, many argue that health claims in advertisements are "designed to deceive" by withholding scientific evidence.[18] One study found that most advertisements promote "energy-dense, nutrient-poor" food of questionable health benefit.[19] There is also a link between the proliferation of health claims and changes in nutritional public policy. Such claims appear on food products throughout the world, but their regulation varies widely among countries: a recent WHO survey of 74 countries found that 35 have no regulation on health claims.[20]

Evidence is important in making and defending arguments about the value of health claims, yet it does not definitively support either side of the debate. This subjective and inconclusive nature of evidence has impacts on our health, policy and society as a whole.

Resources

  1. Mozaffarian D, Rosenberg I, Uauy R. History of modern nutrition science—implications for current research, dietary guidelines, and food policy. BMJ [Internet]. 2018:361-392. Available from:
  2. Moodie A. Before you read another health study, check who's funding the research [Internet]. The Guardian. 2016 [cited 7 December 2018]. Available from:
  3. O’Connor A. Coca-Cola Funds Scientists Who Shift Blame for Obesity Away From Bad Diets [Internet]. New York Times Blog. 2015 [cited 7 December 2018]. Available from:
  4. Mozaffarian D. Conflict of Interest and the Role of the Food Industry in Nutrition Research. JAMA [Internet]. 2017 [cited 6 December 2018];317(17):1755-1756. Available from:
  5. Aaron D, Siegel M. Sponsorship of National Health Organizations by Two Major Soda Companies. American Journal of Preventive Medicine [Internet]. 2017 [cited 7 December 2018];52(1):20-30. Available from:
  6. Constitution of the World Health Organization. Bulletin of the World Health Organization [Internet]. 1946 [cited 6 December 2018]; Basic Documents (45th Edition, Oct 2006):1. Available from:
  7. Tulloch A. What do we mean by health? British Journal of General Practice [Internet]. 2005 [cited 2018 Dec 3];55(513):320–3. Available from:
  8. Golay A, Ybarra J. Link between obesity and type 2 diabetes. Best Practice & Research Clinical Endocrinology & Metabolism [Internet]. 2005;19(4):649-663. Available from:
  9. Burke G, Bertoni A, Shea S, Tracy R, Watson K, Blumenthal R et al. The Impact of Obesity on Cardiovascular Disease Risk Factors and Subclinical Vascular Disease. Archives of Internal Medicine [Internet]. 2008;168(9):928. Available from:
  10. Palmer S. When Thin Is Fat — If Not Managed, Normal Weight Obesity Can Cause Health Issues. Today’s Dietitian Vol 13 [Internet]. 2011 [cited 6 December 2018];(1):14. Available from:
  11. Definition of “advertising” from the Cambridge Academic Content Dictionary © Cambridge University Press. Available from
  12. Williams P. Consumer Understanding and Use of Health Claims for Foods. Nutrition Reviews. 2005;63(7):256-264.
  13. Katan M. Health claims for functional foods. BMJ. 2004;328(7433):180-181.
  14. Daily dietary fat and total food-energy intakes--NHANES III, Phase 1, 1988–91. JAMA: The Journal of the American Medical Association [Internet]. 1994;271(17):1309-1309. Available from:
  15. Mathios A, Ippolito P. Food companies spread nutrition information through advertising and labels. Food Review. 1998;21:38-44.
  16. Stephen A, Wald N. Trends in individual consumption of dietary fat in the United States, 1920–1984. The American Journal of Clinical Nutrition. 1990;52(3):457-469.
  17. Calfee J, Pappalardo J. Public Policy Issues in Health Claims for Foods. Journal Of Public Policy and Marketing. 1991;10(1):33-53.
  18. Liebman B. Designed to Deceive. Nutrition Action Healthletter. 1999;(26):8.
  19. Lohmann J, Kant A. Effect of the Food Guide Pyramid on Food Advertising. Journal of Nutrition Education. 1998;30(1):23-28.
  20. Parker B. Food For Health – The Use Of Nutrient Content, Health, and Structure/function Claims In Food Advertisements. Journal of Advertising. 2003;32(3):47-55.

Evidence in the Gender Pay Gap

This chapter investigates the use of evidence in debates surrounding the Gender Pay Gap, both from a historical perspective, so as to establish it as an ongoing predicament, and through disciplinary lenses, highlighting current differences of opinion.

Historical Perspective edit

The Gender Pay Gap is a "measure of the difference between men’s and women’s average earnings".[1] It is commonly accepted that this disparity does not necessarily stem from unequal pay for equal work but instead from a historically persistent custom of restricting women's work, upholding cultural values, and dividing labour, with male careers often commanding higher wages.[2]

In 1906, twelve countries committed to a treaty[3] as part of a string of new laws dramatically limiting women's nighttime working hours. To justify these laws, courts often referred to "empirical evidence", originating from practical data collected from factory inspectors' reports and the observations of "medical men",[4] showing that night work causes negative physiological effects in women, including loss of appetite and increased morbidity and mortality.[5] Although the courts valued this evidence, we could question its validity given who collected it and how few people had access to the underlying data.

In 1932, the BBC introduced a marriage bar[6] (the practice of terminating women's employment upon marriage or pregnancy, or of not hiring married women[7]). Such restrictions were widespread during the interwar years and were pursued even in the public sector (within teaching, the civil service and medicine[8]). It was argued that the bar was a necessary response to the economic depression and high male unemployment.[7] However, many felt that the economic rationale cited as evidence of the need for the bar was simply the publicly presented evidence, not the true reason it was put into place. Commentators believed that a social consensus against women's participation in public life was instead to blame.[9] Here, we are faced with how evidence might be selected and manipulated to support one's own rationale.

These are just two of many historical examples of the ongoing issues related to evidence within the restriction of women's work.

Over time, the ways we collect and use evidence have dramatically evolved. This has allowed disciplines already addressing the issue, such as economics, to delve further into it, and has allowed a plethora of other disciplines, such as sociology and psychology, to approach it.

Current disciplinary perspectives edit

Economics uses quantitative evidence concerning disparities in wages and working hours between the sexes to explain the gender pay gap. Its main focus is decomposing this evidence into particular gendered categories concerning age, field of study, level of education, interests and the balance between home and work. Once these factors are controlled for, the proportion of the pay gap attributable to gender discrimination shrinks.

Indeed, economists argue that data generalisations and a failure to analyse all relevant variables have caused the contradictions surrounding this gap.[10][11] For instance, political and social discourse mostly relies on imprecise evidence such as "women’s wages", whilst economic studies focus on highly specific data like "Hours distributions and hourly wage penalties and advantages for hourly workers across six occupational groupings, by sex".[12]

The decomposed evidence used, such as studies of trends in variables and convergence analyses, has led to the conclusion that a majority of this pay gap can be explained by differences in the choices men and women make. For instance, economists such as Harvard professor Claudia Goldin support the use of Becker’s human capital theory to explain why women orient themselves towards the jobs they do.[13][14]

Sociology focuses on the reasons underpinning the degree of occupational sex segregation and on why the sexual division of labour matters for the difference in remuneration between the sexes. Whilst sociologists recognise that Becker's human capital theory plays a certain role in gender wage disparity, they argue that in reality it only accounts for a fraction of the rift.[15] Rather than focusing on autonomous "supply"-side decisions, sociologists highlight broader cultural and infrastructural mechanisms as key contributors to the issue.[16]

To explain the sexual division of labour, sociology focuses on gender socialization as a cause of sex segregation. Its evidence for this consists of sociological studies displaying a positive correlation between a society's emphasis on gender differences and the extent of sex segregation.[17] On a structural level, sociologists such as Reskin highlight personnel practices that actively discourage the mobility of the sexes between certain occupations, particularly in the form of work-time schedules and work equipment.[17]

Sociology suggests that the sexual division of labour in the workforce is paramount because female-dominated spheres of work earn less on average, not because they offer less in terms of human capital but because typically female work is systematically and culturally undervalued.[18][17] As a result, policies that ban direct discrimination fail to address the broader issue.

Unlike sociological theory, psychology focuses on the varied and specific behaviour of individuals. Psychologists are interested in the wage-related impacts of confidence and of individuals' personalities, as reflected in risk-taking, negotiation and competitive behaviours.[19]

Psychology uses evidence collected from psychometric instruments, including achievement-motivation and personality-trait scales that capture confidence. This approach largely ignores the wider social forces that may explain the gender pay gap.

Over-confidence is also statistically modelled so as to produce quantitative evidence for the variables giving rise to pay differences. An example is the Oaxaca-Blinder decomposition model, which suggests we can investigate dissimilarities in gender characteristics to explain differences in remuneration for those characteristics.[20]
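As a rough illustration of what such a decomposition does, the following minimal sketch uses entirely synthetic data (the group labels, coefficients and variable names are our own assumptions, not taken from the studies cited above). It splits the mean gap between two groups into an "explained" part, due to differences in characteristics, and an "unexplained" part, due to differences in the returns to those characteristics:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(n, beta, mean_exp):
    # One explanatory characteristic (e.g. years of experience) plus intercept.
    x = rng.normal(mean_exp, 2.0, n)
    X = np.column_stack([np.ones(n), x])
    wage = X @ beta + rng.normal(0, 1.0, n)
    return X, wage

# Synthetic groups: group A has both higher mean experience and higher returns.
X_a, w_a = simulate(5000, beta=np.array([10.0, 1.5]), mean_exp=12.0)
X_b, w_b = simulate(5000, beta=np.array([10.0, 1.2]), mean_exp=10.0)

# Separate OLS wage regressions for each group.
b_a, *_ = np.linalg.lstsq(X_a, w_a, rcond=None)
b_b, *_ = np.linalg.lstsq(X_b, w_b, rcond=None)

gap = w_a.mean() - w_b.mean()
xbar_a, xbar_b = X_a.mean(axis=0), X_b.mean(axis=0)

# Two-fold decomposition, using group B's coefficients as the reference:
explained = (xbar_a - xbar_b) @ b_b    # differences in characteristics
unexplained = xbar_a @ (b_a - b_b)     # differences in returns to them
print(gap, explained + unexplained)
```

With an intercept included, ordinary least squares guarantees that the two components sum exactly to the raw mean gap, which is the identity the Oaxaca-Blinder method rests on.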

Evaluating Evidence edit

We could argue that it was the lack of access to and manipulation of evidence that allowed for women's working restrictions to be sanctioned. Nowadays, however, we appear to be facing a different problem. Despite an abundance of accessible evidence, researchers today primarily interpret evidence from their own disciplinary perspective, often leading to a clash of opinions as explored above.

Economics often disregards qualitative or theoretical evidence, favouring quantitative empirical evidence. Whilst sociology values quantitative evidence, it recognises that such evidence cannot sufficiently reflect the socio-cultural forces at hand, and as a result it gives equal precedence to qualitative evidence.[17][21] Furthermore, the evidence used in psychology is individualistic, unlike the societal models of sociology, resulting in diverging conclusions regarding the pay gap.

Clearly, single-discipline researchers tend to collect evidence from their own disciplinary perspective to inform their conclusions, with little understanding of the perspectives of other disciplines. Therefore, we believe in the need for interdisciplinary thinkers to overcome this lack of cohesion. Owing to their interdisciplinary foundations, such researchers can approach and understand the theories provided by different disciplines without the ulterior motive of advancing their own discipline's agenda. We feel an interdisciplinary approach would allow for a holistic interpretation of the breadth and depth of the evidence now available to us, hence reaching a more universal consensus.

Bibliography edit

  1. Equality and Human Rights Commission. What is the difference between the gender pay gap and equal pay?. [Internet] [Cited 8 August 2018, Accessed 6 December 2018]. Available at:
  2. Griffin, Emma. What’s to blame for the gender pay gap? The housework myth. [Internet] The Guardian. 12 March 2018. [Cited 7 December 2018] Available at:
  3. Treaty Series. No. 21. 1910. International Convention respecting the Prohibition of Night Work for Women in Industrial Employment. Signed at Berne, 26th September, 1906 (Treaties, Conventions, &c: Women (Night Work)). 20th Century House of Commons Sessional Papers. Command Papers, Cd. 5221, CXII.275. [Cited 1 December 2018] Available at:
  4. Goldmark, Josephine. Fatigue and Efficiency – A Study in Industry. New York: Charities Publication Committee; 1912. Page 252.
  5. Goldmark, Josephine. Fatigue and Efficiency – A Study in Industry. New York: Charities Publication Committee; 1912. Page 211. As referenced in: Woloch, Nancy. A Class by Herself: Protective Laws for Women Workers, 1890s–1990s. Princeton University Press; 2015. Page 93.
  6. Murphy, Kate. A marriage bar of convenience? The BBC and married women's work 1923–39. Twentieth Century British History. 2014; Volume 25 Issue 4. Pages 533-61. [Cited 2 December 2018] Available at:
  7. a b Sisterhood and After Research Team. Marriage and civil partnership. [Internet] Sisterhood and After, The British Library; March 2013. [Cited 2 December 2018] Available at:
  8. Sturge, Mary. THE MARRIAGE BAR. The Lancet. 8 October 1921: Volume 198, Issue 5119, Page 779. [Cited 3 December 2018] Available at:
  9. Redmond, Jennifer and Harford, Judith. “One man one job”: the marriage ban and the employment of women teachers in Irish primary schools. Paedagogica Historica. 2010: Volume 46, Issue 5, Pages 639-654. [Cited 2 December 2018] Available at:
  10. The Economist. Are women paid less than men for the same work? [Internet] The Economist Newspaper; 2017 [Cited December 9 2018]. Available at:
  11. Kai, Lui. Explaining the gender wage gap: Estimates from a dynamic model of job changes and hours changes. Faculty of Economics, University of Cambridge, Department of Economics, Norwegian School of Economics, and IZA. [Internet] 2016. Pages 411-412. [Cited 8 Dec 2018] Available at:
  12. Goldin C. Hours flexibility and the gender pay gap. Center for American Progress; April 2015. Pages 13-15.
  13. Goldin, C. Human Capital. In: Handbook of Cliometrics. Heidelberg, Germany: Springer Verlag; 16 March 2016.
  14. Tverdostup M. and Paas T. Gender Unique Human Capital and Labour Market Returns. University of Tartu, School of Economics and Business Administration, Estonia. [Internet] 2017.
  15. England P. The Failure of Human Capital Theory to Explain Occupational Sex Segregation. The Journal of Human Resources [Internet]. 1982; 17(3):358. [Cited 1 December 2018] Available at:
  16. Andersen J. The Gender Wage Gap: Exploring the Explanations. [Internet]. 2018. [Cited 1 December 2018] Available at:
  17. a b c d Reskin B, Bielby D. A Sociological Perspective on Gender and Career Outcomes. Journal of Economic Perspectives. [Internet] 2005;19(1):71-86. [Cited 2 December 2018] Available at:
  18. England P. Gender Inequality in Labor Markets: The Role of Motherhood and Segregation. Social Politics: International Studies in Gender, State & Society [Internet]. 2005; 12(2):264-288. [Cited 6 December 2018] Available at:
  19. Frédéric Palomino, Eloïc-Anil Peyrache. Psychological bias and gender wage gap. Journal of Economic Behavior & Organization; Volume 76, Issue 3, 2010, Pages 563-573.
  20. Leonora Risse, Lisa Farrell, Tim R L Fry. Personality and pay: do gender gaps in confidence explain gender gaps in wages? Oxford Economic Papers; Volume 70, Issue 4, 1 October 2018, Pages 919–949.
  21. Lips, H. The Gender Pay Gap: Challenging the Rationalizations. Perceived Equity, Discrimination, and the Limits of Human Capital Models. Sex Roles. 2012; 68(3-4):169-185.

Reliability of Legal Evidence

This chapter will explore the benefits and drawbacks of two types of legal evidence, eyewitness testimony and DNA, with reference to the disciplines of psychology and forensic science. It will evaluate the reliability of these two types of evidence, by examining their values and implications in the world of law.

Reliability in the Context of Different Disciplines edit

In law, the reliability of evidence is the degree to which the examiner is able to rely upon it in coming to a decision.[1] However, its scope and limitations are directly influenced by the discipline the evidence stems from. In psychology, reliability is assessed through the consistency of research findings.[2] Eyewitness testimony relies entirely on human memory, which may be tainted by several psychological influences,[3] lowering its reliability. Forensic science, by contrast, is highly authoritative in law: its validity and accuracy have made DNA evidence increasingly unassailable,[4] yet problems of reliability still exist due to its circumstantial and subjective nature.

Eyewitness Testimony as Legal Evidence edit

Eyewitness testimony in legal terms refers to an account provided by people who have witnessed first-hand the event under trial.[5] In the large share of criminal cases where reliable evidence is scarce, courts often turn to eyewitness statements to secure a conviction. Despite being the main form of evidence in many cases,[6] reports indicate that eyewitness statements can be severely inaccurate and are responsible for over 70% of the world's documented false convictions,[7] resulting in an inherent trade-off between relevance and reliability.

Stabbing incident witness, Brisbane - 1942

Psychology of Eyewitness Testimony edit

The reliability of eyewitness testimony can be skewed by a variety of psychological factors; even everyday bodily influences such as anxiety and stress, memory decay and poor eyesight have been shown to contribute to false testimony.[6] Through research, psychologists have concluded that eyewitness evidence can be contaminated by an individual’s visual perception and may lead to incorrect reconstructions of the crime.[8] An often heavier influence on false testimony is “eyewitness talk”, whereby witnesses discuss their recollection of events among themselves and subsequently alter their own memory based on the evidence of their fellow witnesses,[6] leaving them unable to differentiate between their own memory and information learned after the incident.[7] An individual's memory reconstruction may also be inherently biased by their specific cultural background and values.[9] These psychological factors depict how susceptible the human mind is to internal and external influences, no matter how confident the eyewitness may be, and the resulting unreliability of eyewitness testimony in legal trials, despite its prevalence in today’s legal system.

DNA as Legal Evidence edit

Legal Benefits of DNA Forensics edit

DNA Profiling

DNA profiling has been considered the biggest breakthrough in forensic science since the discovery of fingerprinting. Since the 1986 Pitchfork case, the use of DNA in forensic science has grown exponentially. Alec Jeffreys, a University of Leicester geneticist, used DNA to convict the double murderer and rapist Colin Pitchfork, clearing the name of the innocent suspect Richard Buckland and making it the first legal case solved by DNA.[10] It is important to note that in this case, DNA was not used as evidence but rather helped the authorities pinpoint a suspect. The results observed and the range of uses of DNA technology were applauded by many, popularizing its use in forensic labs worldwide.[11]

The late 1990s saw fast development in the use of DNA, leading to a normalization of its use in court. However, owing to the margin for human error and to skepticism, this method was not automatically adopted as the courtroom standard. This aversion led to the creation of operations such as the Innocence Project (USA), whose goal is to use DNA testing to clear the names of those wrongfully convicted. To date it has helped exonerate 326 people, and in around 70% of these cases the wrongful conviction rested on eyewitness misidentification.[12] According to the Innocence Project, tens of thousands of cases in the USA have benefitted from DNA sampling since 1989. DNA has also changed the face of forensic science more broadly: violent crimes such as rape and murder are most often committed by repeat offenders, which has led to the creation of DNA databases that can be used in future investigations.[13]

Drawbacks of DNA Forensics edit

Mishandling of DNA Forensics edit

DNA forensics is widely perceived as an irrefutable and reliable form of evidence, capable of overturning a defendant's adjudication.[14] In fact, any biological evidence collected at a crime scene can be analysed through DNA testing.[15] However, this versatility has led to faulty forensic analysis of DNA, which has caused many wrongful convictions. In 2013, the New York medical examiner's office reviewed more than 800 rape cases that may have involved mishandled DNA evidence.[14] Mishandling of DNA evidence may include the swapping of items within labs, cross-contamination, or disregard for required lab protocols.

“Just because it’s DNA doesn’t mean it’s good science.”[4]

-American biologist and the founder of the Idaho Innocence Project, Greg Hampikian

Misinterpretation of DNA edit

The nature of DNA forensics, in which a sample can contain a mix of DNA from many potential suspects, makes it very difficult for analysts to distinguish individual contributors. In fact, one person's DNA can be found in a place they have never visited.[16] This is caused by secondary transfer, whereby shed human skin cells are carried to different places by other people.[17] In 2013, Michael Coble, an American geneticist at the National Institute of Standards and Technology in Gaithersburg, carried out a scenario test asking 108 laboratories whether a particular DNA sample was part of the mix of DNA found on a ski mask from a crime scene. 73 laboratories inaccurately concluded that the sample was part of the mix.[4] These results indicate how strongly DNA evidence is affected by analysts' discretion. Forensic science is therefore highly circumstantial, meaning that it is subject to interpretation and cannot, by itself, be treated as equivalent to scientific truth.[17]

Overall Implications edit

A research report[18] published by the National Research Council states that the true value of forensics lies in the quality of the biological evidence collected at crime scenes and not solely in its scientific applicability. The value of such evidence is thus heavily dependent on expert interpretation, as it does not come directly from scientific data but from conclusions drawn among several possibilities derived by forensic science. Furthermore, psychological contamination of eyewitness accounts must be considered, as it can drastically impact their credibility. It is a necessary and worthwhile task to think about the possible flaws inherent in different kinds of evidence and to question professional consensus when convictions must be made in life-altering legal cases. Thus, the onus is on scientific and legal professionals to recognize and interpret the true 'value' of all evidence and to adopt a holistic approach to evidence evaluation when concluding a legal case.

References edit

Evidence in Climate Change

Case study: Glaciers retreat in the Andes edit

Climate change has had many impacts on the outlines of our glaciers today, as in the Andes or the Himalayas, where ice has been retreating over the last few decades.[19]

The Quelccaya Ice Cap (QIC) in the Peruvian Andes faces major changes in its ice mass. Studies claim that even minor climatic changes are strongly linked to changes in the ice cap's mass balance.[20] The landscape of the QIC has changed dramatically since 1978 (Fig. 1):[19] this qualitative evidence is proof of the changes in geography. The retreat is due to a significant rise in the Freezing Level Height (FLH), which has increased by approximately 160 m over the past six decades,[21] an increase itself driven by global warming in the Andes.[22] Quantitative evidence, such as air temperature records on land (Ta),[23] can also support these variations. With the help of specific data methods,[21] experiments have measured Ta at the QIC summit, finding a Ta warming rate of 0.14 °C/decade in the area over the period 1979–2016.[21] These Ta anomalies influence the FLH fluctuations of the QIC, which in turn trigger the loss of ice mass. Global warming thus again reshapes the geographical frame of the Andes.[22]
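A warming rate quoted in °C/decade, such as the 0.14 °C/decade above, is simply the slope of a least-squares line fitted to a temperature series. The following minimal sketch uses synthetic annual data standing in for the actual QIC record (the series and its built-in trend are our own assumptions, not the published data):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for an annual near-surface air temperature (Ta) series,
# 1979-2016, built with a known trend of 0.014 degC per year plus noise.
years = np.arange(1979, 2017)
ta = -2.0 + 0.014 * (years - years[0]) + rng.normal(0, 0.05, years.size)

# Slope of the least-squares line, in degC per year, scaled to per decade.
slope_per_year = np.polyfit(years, ta, 1)[0]
trend_per_decade = slope_per_year * 10
print(f"{trend_per_decade:.2f} degC/decade")  # recovers roughly the built-in trend
```

The same fit applied to the real station or reanalysis record is what yields a figure like 0.14 °C/decade.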

Hence, as glaciers retreat, populations experience a shortfall in water supply, and greater expenditure is required to meet agricultural and domestic needs.[24] For instance, the Rio Orientales project in Peru involves building a water tunnel to bring water from more distant sources in response to the glaciers' retreat: these economic changes support the existence of climate change.

The example of glacier retreat in the Andes illustrates that diverse types of evidence in different fields can support the existence of climate change.

Introduction to evidence edit

Evidence in geography edit

Because geography is the science that predominantly portrays our world through images, evidence holds a delicate and important role. Representing the spherical earth on a flat surface, and thus creating a map, is an ongoing challenge that began centuries ago. Reading a map must be enlightened by reason and critical thinking, because the map is also an effective instrument for creating representations which then evolve with history. For instance, the chosen projection alters the appearance of the map, distorting distance, direction and scale, which questions the role of evidence in representing the world. Because there is so much evidence, thanks to satellites, and because maps are difficult to create, choices must be made. Indeed, one mapmaker may use different evidence than another, as the map is the product of their choices. Two types of evidence exist in geography:[25] evidence from qualitative research and evidence from quantitative research. Qualitative evidence informs geography with the support of observations, opinions and unnumbered evidence. Quantitative evidence is informed by numbers, surveys and statistical information.
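The distortion a chosen projection introduces can be made concrete. Under the Mercator projection, for instance, the local scale factor grows as 1/cos(latitude), so high-latitude regions appear inflated relative to the equator (a minimal sketch of the standard formulas, independent of any particular mapping software):

```python
import math

def mercator(lat_deg, lon_deg):
    """Project latitude/longitude (degrees) onto Mercator x, y (unit sphere)."""
    lam = math.radians(lon_deg)
    phi = math.radians(lat_deg)
    return lam, math.log(math.tan(math.pi / 4 + phi / 2))

# The local scale factor is 1/cos(latitude): areas near the poles are inflated,
# which is why Greenland looks comparable to Africa on many world maps.
for lat in (0, 45, 60, 80):
    scale = 1 / math.cos(math.radians(lat))
    print(f"lat {lat:2d} deg: east-west scale x{scale:.1f}")
```

At 60° the east-west scale is already doubled, illustrating why the same satellite evidence can yield maps that convey quite different impressions.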

Evidence in Economics edit

In economics, which can be seen as a counter-discipline to geography, the question arises of how reliable theories are and whether the evidence used to establish them is itself reliable. Economics is described as either a social science or a natural science. In the social sciences, evidence is often acquired through anecdotal or testimonial evidence,[26] which is rather subjective, hence some may consider the findings less reliable. If economics is seen as a natural science, scientific evidence is used to make assumptions. One issue that arises in economics is that the available evidence often corresponds with the theory behind it, yet a further link between evidence and real life is difficult to establish.

Evidence in Climate Change edit

Cultural differences in the acquisition of evidence edit

In economics, different international viewpoints are considered and discussed when examining and evaluating the impacts of climate change, for example at the G20 summit, where international political leaders debate those impacts. Evidence can thus be interpreted and appraised differently across cultures, and working together on this worldwide issue is essential. Cultural differences influencing evidence also appear in the natural sciences: one study carried out by Luncz,[27] investigating the behaviour of chimpanzees across communities, emphasises the existence of varying evidence across cultures and thus the various approaches towards its elucidation. Applied to the real world, cultural differences lead to differences in evidence, which suggest different approaches towards policy-making in the scientific study of economics.

Economics of Climate Change edit

Climate change reveals differing approaches to evidence across disciplines. According to economists, climate change is an outcome of greenhouse-gas emissions leading to negative externalities of production, creating costs that are not paid for by those who generate the emissions.[28] The Kyoto Protocol (1997), aimed at reducing greenhouse-gas emissions,[29] soon received feedback that it would harm economic growth. In 1997, Connaughton argued that the protocol would reduce output by up to $400 billion in 2010, close to the EIA's 2008 calculation of a $397 billion decrease in GDP.[30] It appears that over a period of 11 years, researchers interpreted the data similarly and reached the same conclusion. The scientific study of economics may therefore be considered reliable, as evidence shows that even when it is acquired at different times, the outcomes are consistent.

Geography in climate change edit

Largely caused by humans through the burning of fossil fuels, climate change has many impacts on the outline of our world today, modifying our landscapes.[31] Because geography is a wide and resourceful discipline with many sub-disciplines, its contribution to the understanding of climate change is crucial. Geography indeed offers new understandings of the issue, from perceiving the spatial dimensions of climate change to grasping the urban consequences of global warming. Climate change encompasses transformations such as rising seas and melting ice, as well as extreme weather events, all of which have consequences for the geographic world.[32] Climatology (itself classified within physical geography) addresses the issue through its focus on dynamic and statistical climatology.[33] Geography also aids in understanding the possible effects of climate change on environmental systems and societies. K. O’Brien and R. Leichenko suggest that there are winners and losers of climate change.[34] These winners and losers are divided by geographical position: winners ‘will include the middle- and high- latitude regions, whereas losers will include marginal lands in Africa and countries with low-lying coastal zones’, showing how geography adds to the economic study of climate change and offers different perspectives.

Conclusion edit

Economics and geography work hand in hand to understand climate change. Geography, with its sub-disciplines, provides the scientific basis needed to fully grasp the issue, while economics focuses on its impacts, its future, and how it is apprehended by society. The various types of evidence used by both disciplines aid the global understanding of climate change. Both disciplines are crucial in order to comprehend the issue, to live with it and, to a certain extent, to fight it.

  1. Collins Dictionary of Law. W.J. Steward; 2006. Available from: [Accessed 4th December 2018].
  2. McLeod, S. What is Reliability? Available from: [Accessed 7 December 2018]
  3. Jenkins, L. Memory in the Real World: How Reliable is Eyewitness Testimony? Available from: [Accessed 7th December 2018]
  4. a b c Starr D. Forensics gone wrong: When DNA snares the innocent Science | AAAS. 2016 [cited 1 December 2018]. Available from:
  5. McLeod, S. Eyewitness Testimony. Available from: [Accessed 28th November 2018].
  6. a b c Mojtahedi, D. New research reveals how little we can trust eyewitnesses. Available from: [Accessed 28th November 2018].
  7. a b Mojtahedi, D., Ioannou, M. & Hammond, L. The Reduction of False Convictions. The Custodial Review. 2017; 81: 12. Available from: [Accessed 28th November 2018].
  8. Stambor, Z. How reliable is eyewitness testimony? Monitor on Psychology. 2006; 37(4): 26. Available from: [Accessed 28th November 2018].
  9. UKEssays. Relevance and Reliability of Eyewitness Testimony in Court. Available from: [Accessed 28th November 2018].
  10. Cobain. I. Killer Breakthrough - the day DNA evidence first nailed a murderer. The Guardian. 2016, Available from: [Accessed 3rd December 2018]
  11. Parven. K. Forensic use of DNA information: human rights, privacy and other challenges. University of Wollongong Thesis Collections. 2012. Available from: [Accessed 3rd December 2018]
  12. Innocence Project. DNA Exonerations in the United States. 2018. Available from: [Accessed 3rd December 2018]
  13. Parven. K. Forensic use of DNA information: human rights, privacy and other challenges. University of Wollongong Thesis Collections. 2012. Available from: [Accessed 3rd December 2018]
  14. a b Goldstein J. New York Examines Over 800 Rape Cases for Possible Mishandling of Evidence. The New York Times. 2013, Available from: [Accessed 1st December 2018]

Evidence for the Implications of the Cow in Contemporary India

Introduction

Predominant confessions by district in India as revealed by the 2011 census. (Hindu-Purple, Muslim-Green, Christian-Blue, Sikh-Pink, Buddhist-Yellow, Others-Grey)

According to the 2011 census, 79.8% of Indians practice Hinduism, while only 6% practice Buddhism, Jainism, and other faiths.[1] India's second largest religious population is Muslim: 14.2% of Indians identified as Muslims, equating to about 172 million people. Unlike adherents of Hinduism and most other religions observed in India, Muslims do not hold the cow to be sacred, and the animal continues to be slaughtered and eaten across the country. In the 2014 Indian election, the issue of the cow was debated more hotly than objectively more pressing problems such as corruption and women's safety.[2] In this chapter, we will address the varying issues raised by the cow's status in India from within different disciplines.

Religious and Historical Background

A fundamental belief in Hinduism is that all living beings have souls, and practicing non-violence towards all creatures is the highest ethical value.[3]

During the Indo-Aryan migration, between roughly 1900 and 1400 BCE,[4] cows were the primary domesticated animals and a necessary resource for the migrants, serving as both transport and food. Communities of the Indus Valley Civilisation began to gather in the Ganges River Basin, where the soil was fertile and the weather suitable for agriculture, causing the population to boom. Conflict broke out as the loss of forests and natural resources put a strain on the environment and on communities' ways of life. Upper-class citizens of Vedic society continued to ignore the suffering of most of the population, killing livestock to satisfy their own appetites. An anti-Vedic movement challenged this sense of entitlement, opting instead to protect the welfare of cows and avoid killing them; this movement led to the genesis of Buddhism, the first non-violent religion in India. Despite their initial views, the Vedic upper class eventually readjusted their practices and chose to protect the cow, a change in beliefs that marked the start of Hinduism. Similar ideas were adopted by followers of the Jain faith: as an ancient Indian religion, Jainism rests on the fundamental belief that violence against any living being, including cattle, is wrong.[5]

Varying views on cow veneration were thus crucial to the formation of India's major religions. Historically, these contrasting ideas strained harmonious living, and the religious tension continues to cause problems in the contemporary climate.

The Cow in Politics

India is a federal parliamentary democratic republic, with the Centre-Left Indian National Congress and the Centre-Right Bharatiya Janata Party (BJP) forming the two main parties. Currently, the BJP, led by Prime Minister Narendra Modi, holds the largest representation both in the national parliament and at state level across the country.[6]

Historically, the cow has always been a polarising element of Indian politics. As early as 1870, Sikh sects in Punjab were organising cow protection movements, and the Hindu religious leader Dayananda Saraswati founded a cow protection committee a decade later, in 1882.[7] Owing to the tension caused by contrasting religious views in India, conflicts over cow slaughter have provoked riots for over 120 years: in 1893, more than 100 people were killed in religious riots, and eight died in 1966 following a demonstration outside parliament in Delhi calling for a national ban on cow slaughter.[8] More recently, religious tension continues to cause problems, with cow vigilante violence targeting India's Muslim population: between 2010 and 2017, 63 attacks were reported[9] arising from tension over the sanctity of the cow in India.

This swell in violence has been attributed to the surge of Hindu nationalism in India following the BJP's election.[10] Prime Minister Narendra Modi has been vocal about his views on the importance of the cow, going as far as to promote cow urine as a medicinal product.[11] In this way, he has encouraged many vigilante groups to continue fiercely protecting the cow, with many Hindu nationalists claiming to feel "empowered" by Modi and his party.[12] Since winning in 2014, the BJP has backed such groups with strict laws concerning the cow: in Uttar Pradesh, India's largest state, BJP Chief Minister Yogi Adityanath began his tenure by enforcing a strict lockdown on slaughterhouses.[9]

The political tension continues to evolve as the cow remains a central campaign strategy for the BJP and the Centre-Left Congress alike. The BJP is seeking to replace the tiger with the cow as the national animal of India.[13] Congress is also seeking to capitalise on cow welfare as its main tactic for drawing voters: in the state of Madhya Pradesh, Congress declared that each village within the state boundaries would have a cow shelter. In an attempt to trump this, the BJP stated that a cow ministry would be established for Madhya Pradesh residents.[14] It is clear that the cow continues to be an incredibly vital yet polarising aspect of Indian politics.

The Cow in Industry

Fashion Industry

With India being a huge exporter of fashion goods to high street chains like Zara and fashion houses such as Armani alike, a 2017 crackdown on leather use proved a huge problem for the many designers relying on Indian factories for their production. India is the world's second largest producer of footwear and leather garments, selling $13 billion worth of goods in the 2016 tax year.[15] The BJP ruled that the use of cows and buffaloes for leather is strictly forbidden; the effects have proven devastating for factory workers across India, who face redundancy, as well as for foreign companies. The leather industry is predominantly run by India's Muslim population, and the clampdown has caused greater religious division and tension nationwide,[16] with Muslim workers risking their lives in illegal abattoirs to export leather and maintain small businesses.

Beef industry

In 2016–17, India produced 165.4 million tonnes of milk, the highest output in the world.[17] Additionally, India is heavily dependent on bull power for agriculture and transportation. The cow is therefore seen as highly valuable because it fulfils so many human needs, which helps explain India's cattle population of 190 million, the second highest in the world.[17] However, this huge number of cattle causes societal unrest directed at those who slaughter cows that no longer hold any economic value: non-producing dairy cows and infertile cows are sold and end up in slaughterhouses. It is well known that any ban on slaughterhouses, or the discontinuation of this industry, would affect Muslims and Dalits the most, as these groups, among India's poorest, make up most of the industry's workforce. It would also create illegal slaughterhouses and unsafe labour environments, widening the wealth disparity even further. Even now, people associated with the industry are still beaten, publicly hanged, and murdered.[18] The Indian government wants to maintain its income and exports from cow industries, yet does not address the social and religious implications.

Conclusion

Given the size of the economic market built on using the parts of the cow, as well as the political and religious unrest surrounding this topic, it is valuable to look at the issue from an interdisciplinary perspective. An ideal solution could involve reusing non-producing cows in other ways, such as turning their dung into fertiliser,[19] allowing them to avoid the slaughterhouse and decreasing societal unrest.

References

Evidence in driverless cars

Evaluation of Evidence Within, Surrounding and In Consequence of Self-Driving Cars

This article examines how evidence is evaluated in self-driving cars from an interdisciplinary perspective encompassing: how self-driving cars use algorithms to collect and evaluate evidence (section 'within', disciplines: computer science and engineering), how policy-makers deal with risk and the uncertainty of evidence (section 'surrounding', disciplines: politics, statistics and psychology), and the role of evidence as an ethical entity (section 'in-consequence', discipline: ethics). A different, unique definition of evidence will be applied to each section in order to show the breadth of meaning of this concept.

The interior of Google's driverless car lacks most usual elements of a vehicle, illustrating the lack of human input required.

Within: Evidence and Bayes' Theorem

Evidence, within a driverless vehicle, is defined as the continuous information gathered by cameras, radar and laser sensors from the surroundings. Algorithms form the central processing body, turning this data into reasoned actions. Under the SAE classification of driving automation, a vehicle graded at Level 5 is fully autonomous in all driving modes, navigating entirely without human input.

Convolutional neural networks (CNNs) have been revolutionary in 'training' the algorithm in driverless cars, allowing them to learn automatically from training drives. CNNs use pixels from a front-facing camera to direct steering commands.[20]
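As a hedged, toy sketch of this idea (not the actual architecture or trained weights of any production system), the pipeline from camera pixels to a steering command can be illustrated as a single convolutional layer followed by a dense read-out; every array shape and number below is invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

def conv2d(image, kernel):
    """'Valid' 2-D cross-correlation of a grayscale image with one kernel."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.empty((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def steering_angle(frame, kernel, dense_w):
    """Map one camera frame to a steering command in (-1, 1)."""
    features = np.maximum(conv2d(frame, kernel), 0.0)  # ReLU activation
    return float(np.tanh(features.ravel() @ dense_w))  # squash to (-1, 1)

frame = rng.random((8, 8))            # stand-in for a camera frame
kernel = rng.standard_normal((3, 3))  # one filter (random here, learned in practice)
dense_w = rng.standard_normal(36)     # (8 - 3 + 1)**2 = 36 feature values

angle = steering_angle(frame, kernel, dense_w)
```

In a real CNN such as the one in the cited paper, many stacked filters are learned from recorded human driving rather than drawn at random, but the forward pass has this same shape: convolution, non-linearity, dense read-out.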

This system operates largely on the basis of Bayes' theorem.[21] Simply put, Bayes' theorem offers a systematic way to update one's belief in a hypothesis on the basis of the evidence presented. For example, Google's driverless cars combine evidence from Google Street View with artificial intelligence software.
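As a minimal worked illustration of such an update (all probabilities here are invented for the example; real perception systems are far more elaborate), consider a classifier deciding whether an object ahead is a pedestrian:

```python
def bayes_update(prior, p_e_given_h, p_e_given_not_h):
    """Return P(H | E) via Bayes' theorem for a binary hypothesis H."""
    p_e = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
    return p_e_given_h * prior / p_e

# Prior belief: 2% of detected objects on this road are pedestrians.
# The detector fires for 95% of pedestrians (true positives) and for
# 5% of non-pedestrians (false positives).
posterior = bayes_update(prior=0.02, p_e_given_h=0.95, p_e_given_not_h=0.05)
print(round(posterior, 3))  # prints 0.279
```

A single detection therefore raises the belief from 2% to roughly 28%; repeated detections across successive frames would push it higher, which is why continuous sensor evidence matters.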

Occasionally, the human operator is required to take driving control. A vehicle considered to be Level 3 can monitor its environment and drive with full autonomy under certain conditions, but not if sensors become damaged in challenging weather conditions.[22] Additionally, external data sources can contradict each other; yet if the principles of Evidentialism hold, each source is justified in its recommendation to the driverless vehicle so long as its evidence supports it.[23] To overcome this, the algorithm may redirect control to the human driver.

However, increasing reliance on automated systems could mean that humans will not maintain the skills to operate cars competently.[24] Consequently, although algorithms arose from computer science, their future role in driverless transportation is also relevant in the social and political disciplines.

Surrounding: Evaluating Evidence in Risk Assessment

The definition of evidence as "that which justifies belief"[25] illustrates the potential use of evidence in informed policy-making, where decisions are often justified by the assessment of potential risk.

Uber self-driving car showing damage after crashing into a pedestrian, reported by the "National Transportation Safety Board"

The cases of human deaths in the crashes of self-driving cars[26] show that their development and implementation can pose safety questions. These questions are investigated through risk assessment, which involves collecting and evaluating evidence on the variety of possible hazardous events and the probability of their occurrence.[27]

Two modes of human evidence evaluation, the "analytic system" and the "experiential system", are utilized in risk assessment. The former uses normative rules (including statistics and formal logic), while the latter relies on emotion (including associations and experiences); even so, the "analytic system" requires the guidance of the "experiential system".[28] Programmers might therefore be considered to use their "experiential systems" to decide, for example, how the algorithm should react to certain situations (see the 'in consequence' section), so that the algorithm's evaluation of evidence (e.g. data) as the "analytic system" collaborates with the programmer's "experiential system".

Limitations in obtaining and evaluating evidence

Psychological factors affect the evaluation of evidence performed by humans, who consequently make predictions and form policies. The perception of self-driving cars is connected to the emotions felt towards this innovative technology.[9] Therefore, evidence is important to inform opinions. Another concern is the possible access of third parties to personal information compiled by self-driving cars.[29] The continuous data gathered about the surroundings can likely be used freely, as it is collected in public. This contributes to privacy concerns and negative feelings towards the technology.[30]

Statistics defines observed data as evidence and provides methods to evaluate it.[31] Evidence about fatalities and injuries involving self-driving vehicles is hard to obtain because the vehicles have not yet driven sufficient miles to provide clear statistical evidence.[32] Since fatalities and injuries are rare events relative to miles driven, the cars would need to complete hundreds of millions of miles to provide reliable evidence.[32] These limitations show that it might not yet be feasible to demonstrate safety; uncertainty may therefore remain, which affects policy-making.[32]
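The scale of the problem can be sketched with the reasoning used by Kalra and Paddock:[32] if a fleet drives n fatality-free miles, then to claim with 95% confidence that its per-mile fatality rate is below a benchmark rate r, n must satisfy (1 - r)^n ≤ 0.05. The short calculation below illustrates that argument (the benchmark figure is the study's US human-driver rate; this is a sketch of the logic, not a reproduction of the study's full analysis):

```python
import math

def miles_needed(benchmark_rate, confidence=0.95):
    """Fatality-free miles needed to show the rate is below the benchmark."""
    return math.log(1 - confidence) / math.log(1 - benchmark_rate)

# Roughly 1.09 fatalities per 100 million vehicle miles for human drivers.
human_rate = 1.09e-8

print(f"{miles_needed(human_rate) / 1e6:.0f} million miles")  # prints "275 million miles"
```

Hundreds of millions of fatality-free miles is far more than any current test fleet has accumulated, which is why a purely statistical demonstration of safety remains out of reach.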

Approaches to uncertain evidence

One approach to dealing with the uncertainty of evidence in policy-making is the precautionary principle. Its meaning can be reduced to adopting measures to avoid harm to human health and the environment, even if these are not confirmed with data.[33] For example, the USA's NHTSA safety standards assume that a human driver should always be able to control the actions of a motor vehicle in order to ensure its safety.[34]

However, taken to its extreme, the precautionary principle could lead to refraining from taking any action at all.[33] A more moderate approach is represented by adaptive regulations, which create new evidence (e.g. through pilot experiments) and review it in order to adapt to the evolution of technology.[32] In the case of autonomous vehicles, adaptive regulations might become a mediator in the negotiation between risk and progress, as experience and technological change inform safety deliberations.[32]

In Consequence: Evidence as an Ethical Entity

In relation to ethics, evidence can be defined as the outcomes of driverless cars operating in accident scenarios, which are used to determine how algorithms should react.

Programming autonomous cars requires addressing dilemmas in which the algorithms must make decisions in no-win situations, or trolley-problem premises, choosing which of the people involved are implicated, perhaps harmfully. One concern relating to these decisions is whether autonomous cars should act in the interest of the passengers or of society. Although these are philosophical thought experiments, they help determine how the algorithms will react in accident scenarios where collisions are unavoidable.[35]

A Waymo self-driving car on the road in Mountain View

There is, however, no evidence to suggest which reaction is the best way for a self-driving car to respond. From a utilitarian economic perspective, it should maximise total social benefit, resulting in the accident that incurs the least total cost. From an engineering perspective, optimisation of machine functions and decisions outweighs ethical and legal considerations.[36] From a legal standpoint, optimising an algorithmic decision to kill is unjustifiable and indefensible.[37] An interdisciplinary outlook must be applied, as there are many conflicting interests and little evidence to suggest a clear prioritisation of factors in these trolley problems.

Societal cultural values, which differ across nations, shape the normative ethical beliefs of individuals within those societies.[38] Studies across a range of countries have demonstrated varying opinions on the implementation of autonomous cars,[39] revealing differences in ethical considerations. The validity of evidence is dependent on the desired outcome, and desired outcomes will vary. There is a lack of real-world evidence to guide a resolution among these variations in normative ethical ideas, as autonomous cars remain relatively untested.

Conclusion

This article has analysed how evidence is evaluated in both practical settings and in an abstract, emotional form: relating to the disciplines of computer science, engineering, statistics, psychology, ethics, and politics.

Bibliography

The Commercialisation of Social Media and its Impact on Truth

The growing use of social media platforms has changed the consumption of information. Users increasingly follow influencers and brands, which have a vested commercial interest in users' interactions. This impedes access to reliable truth, as shared information can function to support an economic agenda, blurring the lines between absolute truth and biased information. This chapter will explore how commercial interests and human behaviour interact to distort truth on social media platforms.

Social media as contemporary commercial propaganda

Image showing the difference between disinformation and misinformation.

It is argued that commercial propaganda, usually presented in the form of journalism, is one of the four main propaganda streams, playing a significant role in publicity, advertising and public relations.[9] Historically, commercial entities have used the media to manipulate the truth. Before social media, the most common propaganda instrument was the newspaper, in which 70% of revenue came from selling advertisements rather than conveying objective information.[9] During the 2008 subprime crisis, the distribution of the "All is Well" message from Wall Street via social media continually misled readers into taking more risks; this was disinformation intended to protect profits, but it consequently brought adverse effects for those who believed it.[40] Thus, the use of media by commercial entities to protect and increase their profits can be seen throughout history and is not isolated to social media.

In fact, the influence of the media in general on public access to reliable truth is well researched; truths can be influenced by journalism, as the commercial potential of an article forms the primary basis for selecting information to be publicised.[41] Journalistic perspective also acts as a barrier to truth, as there are often discrepancies between the information given to the public and the information in its original, unedited form. Social media is now our dominant information source, so it is logical to infer that if traditional media impacts our access to reliable truth, the same holds for its subcategory, social media.

Social media as an economically viable strategy

The use of influencers and of interactions by users creates a sense of credibility, regardless of whether there is any truth in the information shared. This de-commercialised marketing strategy allows commercial messages to be seen as trustworthy, as "trust in social media is synonymous with credibility and reliability".[42] This suggests that it may be harder for a user to distinguish between information shared by a brand through influencers and information shared by influencers themselves as genuine opinion. At present, there is a notable increase in the number of corporations investing in social media marketing strategies, due to the positive impact on customers' perception of brands, as well as the number of people using social media platforms to make purchase decisions.[33] Some social media campaigns have shown an 83% net return on investment and 40% increases in revenue growth rate. This economic perspective shows that commercialisation in social media is evident and plays an integral role in business.[34] Because of this, there is a concern that this increased interest from businesses in social media has contributed to the growing problem of 'misinformation', a term that was named word of the year for 2018.[43]

Social media and human behaviour

The consumption of products on social media platforms is influenced by factors other than absolute truth. Some people are more likely to be persuaded by the portrayal of a product in images than by the quality of the product itself.[44] If the images are seen as perceived truth, and product quality as a form of absolute truth, this shows that absolute truth is not always what drives a consumer's decision-making process. This is further emphasised by the finding that factual advertising can, in fact, decrease purchase potential due to scepticism when consumers are faced with empirical information.[45] Here, the truth is not only less persuasive than other information sources but also acts as a hindrance to the aims of the information sharer. This provides an explanation for why corporations are motivated to share information which diverges from the truth: the truth does not always lead to the outcome that the information sharer desires.

Logo of popular social media platform 'Instagram' inside of a scroll.

Inter-user communication influences the perception of brands to a greater extent than information shared by the corporations behind those brands.[46] This user interaction can often cause misinformation to spread across a platform. False news spreads both faster and further than correct information because it provokes a higher emotional response than verified information, tying into engaging topics relevant to current events.[47] Commercial entities have taken advantage of this by making advertisements trend-specific, as well as by utilising influencer marketing to make their message relevant to users.

In addition, the constant stream of information on social media increases the frequency with which people interact with information, increasing its circulation and efficacy. Most young people have an active presence on multiple social media sites and check their phones over 70 times a day.[32] This presence increases the likelihood of repeated exposure to the same information. The application of algorithms on social media platforms, as well as cross-platform marketing strategies, also increases exposure, thereby increasing trust in the information provided. This persistence of information means users are less likely to check facts and will most likely accept the information given. This leads to a reliance on confirmation bias, in which the user deduces whether the information given fits what they already believe to be true. This bias increases the spread of misinformation, as users then choose not to verify whether something is true.[48]

However, this approach to understanding how commercialisation adversely impacts the truth of content on social media platforms rests on the assumption that, without these vested interests, social media would be a space free of misinformation. This assumption is problematic, as opinions and information, both real and false, are spread due to the platforms' communicative nature. Users must understand that they are potentially being manipulated through the content they receive via social media and that they are responding to opinion rather than fact.[32] Therefore, the spread of misinformation does not result exclusively from the spreading of biased opinions by commercial entities, but also from users' own behaviour.

Conclusion

In order to understand how commercial interests are successful in impacting the truth of information on social media platforms, it is important to view it through the lens of history, economics and human behaviour. The spread of biased information via social media functions to support an economic agenda, which brings returns for companies who engage in this practice. Therefore, it is understandable why commercial entities utilise human behaviour patterns to manipulate users for commercial gain. History shows us that the circulation of biased information which impacts truth is not new, which suggests that the lines between impartial information and biased advertising will continue to be blurred. Thus, to combat the adverse impact of commercialisation on truth, it may be that the responsibility lies with the user to be aware that social media is commercialised and to fact-check information for confirmation of truth.

References

  1. Press Information Bureau Government of India Ministry of Home Affairs. 25 August 2015. RGI releases Census 2011 data on Population by Religious Communities. Available from:
  2. Staples J. Appropriating the Cow: Beef and Identity Politics in Contemporary India. In: Bhushi K, editor. Farm to Fingers: The Culture and Politics of Food in Contemporary India. Cambridge: Cambridge University Press; 2017. p. 58–79.
  3. Marvin Harris. India's sacred cow, Anthropology: contemporary perspectives. 6th edition, Editors: Phillip Whitten & David Hunter, Scott Foresman, ISBN 0-673-52074-9, 201–204
  4. Axel Michaels. 2004, Hinduism. Past and present, Princeton, New Jersey: Princeton University Press. 32-36. Available from:
  5. Susan J. Armstrong, Richard G. Botzier, 18 November 2016. The Animal Ethics Reader. ISBN 978-1-317-42197-9. Available from: p. 44.
  6. Graham BD. The Jana Sangh as a Hindu nationalist rally. In: Hindu Nationalism and Indian Politics: The Origins and Development of the Bharatiya Jana Sangh. Cambridge: Cambridge University Press; 1990. p. 94–157. (Cambridge South Asian Studies).
  7. Bauman, Chad M. Pentecostalism in the Context of Indian History and Politics. Oxford University Press, 2015. Available from:
  8. Soutik Biswas, BBC News [Internet]. Why the humble cow is India's most polarising animal, 15 October 2015. Available from:
  9. a b c d e Amy Kazmin. Indian PM distances himself from cow vigilante attacks. Financial Times. 17 July, 2017. Available from:
  10. Soutik Biswas [Internet]. Why stopping India's vigilante killings will not be easy. 10 July 2017. Available from:
  11. Hugh Tomlinson. Indians told cow urine is health drink. The Times, 23 March 2018. Available from:
  12. Iain Marlow and Bibhudatta Pradhan [Internet]. Cow-Saving Vigilantes are a Sign of Political Rising Political Risk in India. 20 April 2017. Available from:
  13. Parihar, Rohit. “Gau Rakshika from Rajasthan Wants Modi to Declare Cow India's National Animal.” India Today, 2017.Available from:
  14. “Despite Four Agencies on Bovine Welfare, Rajasthan to Set up Cow Ministry Soon [Politics and Nation].” The Economic Times, New Delhi, 2014. Available from:
  15. 9. Global fashion giants fret over India cow crackdown: industry. Eastern Eye 2017 Jun 02(1407):17.
  16. “How Many Ways Can You Skin a Cow? In Hindu India, Plenty --- Thriving Leather Industry Relies On Muslims, `Fallen' Cattle; Next, a Bovine Pension Plan?” Wall Street Journal, by By Daniel Pearl, New York, N.Y., 2001, p. B.1. Available from:
  17. “Role of Livestock in Indian Economy.” Vikaspedia. Available from:
  18. Alam, Afroz. “'Cow Economics' Are Killing India's Working Class.” The Huffington Post, 22 June 2017.
  19. Hedge, Narayan. 1995, Economic Gains as Primary Considerations against ban on Cow Slaughter. Yojana (Marathi) Dec. Vol. : 23-27.
  20. Bojarski M, Del Testa D, Dworakowski D, Firner B, Flepp B, Goyal P, Jackel LD, Monfort M, Muller U, Zhang J, Zhang X. End to end learning for self-driving cars. arXiv preprint arXiv:1604.07316. 2016 Apr 25. 1-4. Available from [Accessed 7th December 2018].
  21. D'Agostini G. A multidimensional unfolding method based on Bayes' theorem. Nuclear Instruments and Methods in Physics Research Section A: Accelerators, Spectrometers, Detectors and Associated Equipment. 1995 Aug 15;362(2-3):487-98, Available at:. [Accessed 5th December 2018].
  22. Paden B, Čáp M, Yong SZ, Yershov D, Frazzoli E. A survey of motion planning and control techniques for self-driving urban vehicles. IEEE Transactions on intelligent vehicles. 2016 Mar;1(1):33-55. Available at [Accessed 1st December 2018].
  23. Feldman R. Evidentialism, higher-order evidence, and disagreement. Episteme. 2009 Oct;6(3):294-312. Available at:. [Accessed on 25th November 2018].
  24. Fry, H. (2018). Hello World. [S.l.]: Doubleday, pp. 122-125. [Accessed 1st November 2018].
  25. Kelly T. Evidence. The Stanford Encyclopedia of Philosophy. Winter 2016 ed. 2016. Available from: [Accessed 9 December 2018].
  26. Burns L, Shulgan C. Autonomy. The Quest to Build the Driverless Car – And How It Will Reshape Our World. 1st ed. HarperCollins; 2018.
  27. Ostrom L, Wilhelmsen C. Risk assessment. Tools, Techniques, and Their Applications. Hoboken, New Jersey: John Wiley & Sons, Inc.; 2012.
  28. Slovic, P., Finucane, M., Peters, E. and MacGregor, D. Risk as Analysis and Risk as Feelings: Some Thoughts about Affect, Reason, Risk, and Rationality. Risk Analysis. 2004;24(2): 311-322. Available from: [Accessed 3rd December 2018].
  29. Elmaghraby, A. S., Losavio, M. M. Cyber security challenges in Smart Cities: Safety, security and privacy. Journal of Advanced Research. 2014;5(4): 491-49. Available from: [Accessed 25th November 2018].
  30. Bloom C, Tan J, Ramjohn J, Bauer L. Self-driving cars and data collection: Privacy perceptions of networked autonomous vehicles. In: SOUPS '17: Proceedings of the 13th Symposium on Usable Privacy and Security, July 2017. USENIX. 2017. Available from: [Accessed 1st December 2018].
  31. Royall, R. On the Probability of Observing Misleading Statistical Evidence. Journal of the American Statistical Association. 2000;95(451): 760-768. Available from: [Accessed 25th November 2018].
  32. a b c d e f g Kalra, N., Paddock, S. M. Driving to safety: How many miles of driving would it take to demonstrate autonomous vehicle reliability? Transportation Research Part A: Policy and Practice. 2016;94: 182-193. Available from: [Accessed 25th November 2018].
  33. a b c Gardiner S. A Core Precautionary Principle. Journal of Political Philosophy. 2006;14(1):33-60.
  34. a b U.S. Department of Transportation. Preparing for the Future of Transportation: Automated Vehicles 3.0. U.S. Department of Transportation; 2018: 6-7.
  35. Nyholm S, Smids J. The Ethics of Accident-Algorithms for Self-Driving Cars: an Applied Trolley Problem?. Ethical Theory and Moral Practice. 2016;19(5):1275-1289. Available from: [Accessed 8th December 2018].
  36. Gogoll J, Müller J. Autonomous Cars: In Favor of a Mandatory Ethics Setting. Science and Engineering Ethics. 2017;23(3):681-700. Available from: [Accessed 7th December 2018].
  37. Coca-Vila I. Self-driving Cars in Dilemmatic Situations: An Approach Based on the Theory of Justification in Criminal Law. Criminal Law and Philosophy. 2018;12(1):59-82. Available from: [Accessed 7th December 2018].
  38. Chatterjee S, Tata R. Convergence and Divergence of Ethical Values across Nations: A Framework for Managerial Action. Journal of Human Values. 1998;4(1):5-23. Available from: [Accessed 8th December 2018].
  39. Kyriakidis M, Happee R, de Winter J. Public opinion on automated driving: Results of an international questionnaire among 5000 respondents. Transportation Research Part F: Traffic Psychology and Behaviour. 2015;32:127-140. Available from: [Accessed 8th December 2018].
  40. Donald G. Business propaganda buries truth and threatens democracy. CCPA Monitor. 2018;16(6):16-18
  41. Winsten, J. Science and the media – the boundaries of truth. Health Affairs [online] 1985
  42. Bryson R. The Importance of Trust on Social Media [online]. The Conference Board of Canada. 2017 [accessed 4 December 2018]. Available from:
  43. Italie, L. chooses ‘misinformation’ as word of the year. The Columbian [online] 2018 [accessed 30.11.18]. Available at: dictionary-com-chooses-misinformation-as-word-of-the-year
  44. Snyder, M. & DeBono, K. Appeals to image and claims about quality: Understanding the psychology of advertising. Journal of Personality and Social Psychology. [online] 1985 [accessed 23.11.18] 49(3), 586-597. Available at:
  45. Koslow, S. Can the truth hurt? How honest and persuasive advertising can unintentionally lead to increased consumer scepticism. The Journal of Consumer Affairs. [online] 2005 [accessed 23.11.18] 34(2) 245-267. Available at:
  46. Schivinski, B. & Dabrowski, D. The effect of social media communication on consumer perceptions of brands. Journal of Marketing Communications [online] 2014 [accessed 23.10.18] 22(2), 189-214. Available at:
  47. Vosoughi et al. The spread of true and false news online [online]. American Association for the Advancement of Science; 2018 [accessed 28.11.18] p. 1146–1151. Available from:
  48. Ecker U. Why rebuttals may not work: the psychology of misinformation. Media Asia [online]. 2017 [accessed 28.11.18];44(2):79-87. Available from:

Truth in the Diagnosis of Depression

Introduction

More people than ever are being diagnosed with depression. A meta-analysis of American students concluded that, between 1938 and 2007, diagnosed cases increased up to eight-fold[1], and it is estimated that a quarter of the world's population will be diagnosed with depression at some point in their lifetime. Is this the result of a changing environment or simply the consequence of redefining what constitutes depression? The WHO is preparing the 11th revision of the ICD, which poses the question of whether the dimensions uncovered by the past decade of research will lead to the addition of new mental disorders, or merely the redefinition of old ones.

Since psychology can be approached through both empirical and social-science methodologies, examining the truth in the diagnosis of depression through different disciplines allows a comprehensive understanding of how difficult it is to fix the parameters of diagnosis while accounting for variations in depression, and can guide research into how to tackle the issue globally.

Diagnosing Depression

Diagnosing depressive disorders is a complex process which currently depends on social-science methodology. Physicians treat the DSM-V criteria as integral indicators of whether a patient may suffer from clinical depression. Additional screening can involve interviews or self-report questionnaires, such as the Beck Depression Inventory and Zung's self-rating depression scale.

These methods are problematic because somatic symptoms are not unique to depressive disorders. Furthermore, some patients emphasise physical symptoms over emotional distress, leading to false-negative diagnoses[2]. Conversely, some sources contend that we are over-diagnosing depression[3][4]: a study assessing diagnoses made by GPs during routine examinations found that, of every 100 patients seen, roughly 35 were labelled as depressed but only about 10 of those diagnoses were correct[5]. This shows that the methods currently used by physicians fail to provide accurate diagnoses, and that new analytical processes are required to uncover the true prevalence of depression.

Surprisingly, suggested alternatives are limited. One was to supplement two highly sensitive screening questions with a question asking patients whether they would like help[6]; this improved diagnostic validity[6] but failed to acknowledge contributing factors such as genetic predisposition. Another suggested option is Bayesian network modelling[7][8][9]. This is a more complete process, as prior observations, e.g. genetics, can be accounted for alongside the patient's symptoms[10]. It could also let us differentiate between the different forms of depression, since the relative contributions of different variables to the diagnostic statistic highlight the driving factors behind the disorder, overall providing more truthful diagnoses of the illness.
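As a minimal illustration of the Bayesian idea (every probability below is invented for the example and carries no clinical meaning), a prior belief about depression, raised here by an assumed genetic predisposition, can be updated with the likelihood of the observed symptoms:

```python
# Hypothetical sketch of a Bayesian diagnostic update. All numbers are
# assumptions made for illustration, not clinical estimates.
def posterior(prior, p_symptoms_if_depressed, p_symptoms_if_healthy):
    # Bayes' theorem: P(depressed | symptoms)
    p_symptoms = (p_symptoms_if_depressed * prior
                  + p_symptoms_if_healthy * (1 - prior))
    return p_symptoms_if_depressed * prior / p_symptoms

base_rate = 0.10            # assumed population prevalence
family_history_rate = 0.20  # assumed higher prior given genetic predisposition

# assumed likelihoods of the reported symptom pattern
print(round(posterior(base_rate, 0.7, 0.2), 3))            # 0.28
print(round(posterior(family_history_rate, 0.7, 0.2), 3))  # 0.467
```

A full Bayesian network generalises this single update to many interacting variables, which is what makes the relative contribution of each factor to the diagnosis visible.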

Biology and Depression

The aforementioned diagnostic techniques rely heavily on methodologies which study the consequences rather than the causes of depression, while the causes, such as biological predisposition, can be studied empirically, giving us a more comprehensive picture.

Neural circuits signalling the difference between normal brain function (left) and brain function of those with mutated alleles of the 5 HTT gene (right)

More than 10 different genes are currently suspected to play a role in causing depression[11]. The diathesis-stress model[12] shows that carriers of mutations in the 5-HTT gene who also experience stressful life events are markedly more likely to develop depression. The mutated allele codes for a modified serotonin transporter, preventing cells from handling serotonin as they should.

The issue with this approach is the limited knowledge we have of both neurotransmission and genetic predisposition. Studies like Caspi's are correlational and rely on self-reporting. Nonetheless, we can reasonably state that depression derives in part from chemical imbalances or genetic abnormalities, and can be treated with medication. Researching the essential biological factors behind the disorder will help us improve our means of treatment.

Social Variations in Depression

If the truth of diagnosis relied solely on biological predisposition, there would be an unrefined but reliable tool for extracting it. However, examining the socially constructed variations in the experience of depression shows how the social sciences can explain the way cultural difference contributes to the ambiguity of diagnosis.

Cultural Differences in Depression

The variation in diagnostic criteria between cultures is staggering; cultures have been shown to exhibit psychological phenomena in different ways, and accordingly to modify their classifications of depression and its diagnosis[13].

When each country's own data are used, major differences appear in prevalence rates between Eastern and Western nations, with Asian countries, particularly China and Japan, showing lower rates[14]. However, when the United States' criteria were applied universally, similar prevalence rates were observed. The difference is attributed to holistic views of society that give depression a taboo status: doctors are more likely to diagnose neurasthenia, which focuses on somatic symptoms and thereby removes the stigma of mental illness[15]. This demonstrates that cultural bias matters when looking at depression and its diagnosis, and it is why the social sciences and anthropology can help us understand how the illness is perceived and aid us in tackling it.

Gender Variations in Depression

Virginia Woolf lived with a different definition of depression

Gender is another element that brings about great variation in the prevalence of depression: lifetime prevalence in women is approximately double[16] that of men, even though male suicides are 3-4[17] times more frequent. George Brown developed a social vulnerability[18] model which holds that weak protective factors, high-risk factors and provoking agents increase susceptibility to depression. The role-strain hypothesis[19] takes an interactionist approach, suggesting that social roles and cultural factors (such as patriarchal norms), along with gender differences in biological responses to stressors, increase female vulnerability to depression.

Historical Perspective

Assuming that biological predisposition is an independent variable, further isolation of the social factors could be achieved by viewing depression from a historical perspective. Take, for example, Virginia Woolf: the understanding of mental illness was significantly different in her lifetime[20]. Judging the past by modern-day standards is a prominent criticism of historical methodology and, in this case, it raises the question of whether the 'truth' of Woolf's illness would have been experienced differently without a cognitive label.

Different Truths in Depression

This often-contradictory evidence raises an interesting question; is there a need to search for a universal truth in the diagnosis of depression?

The WHO describes depression as "the leading cause of disability worldwide"[21], creating the need to tackle depression globally, yet we cannot simply apply the standardised treatment methods that are used worldwide for physical disorders.

If seen from the angle of biological and statistical analysis, we can observe a degree of universality in the illness[22] which might benefit from a holistic approach, and hence a more concise deductive definition of the illness. However, by considering the cultural, historical, and gender variations, we can infer that a universal definition would homogenise the disorder and thus actually distance us from the truth.

Philosophical theories of truth are valuable tools for framing arguments for either point of view. Correspondence theory, for example, asserts that truth hinges upon correspondence to fact, where a fact is defined as existing within a set of properties and universals. Coherence theory, by contrast, argues that truth exists only within a coherent belief system, which validates the existence of multiple 'truths'[23].

These differences appear to exist in contradiction, much like the nature-nurture debate. However, by looking at the issue through different theories, methodologies, and disciplines, we can establish a more effective working framework for the diagnosis of depression, one which both illuminates areas of further research and serves as a reference point for existing theories.

Notes

  1. Hidaka B. Depression as a disease of modernity: Explanations for increasing prevalence. Journal of Affective Disorders [Internet]. 2012 [cited 22 November 2018];140(3):205-214. Available from:
  2. Goldman, L., Nielsen, N. and Champion, H. Awareness, diagnosis, and treatment of depression. Journal of General Internal Medicine [Internet]. 1999 [cited 29 Nov. 2018]; 14(9):569-580. Available from:
  3. Aragones, E. The overdiagnosis of depression in non-depressed patients in primary care. Family Practice [Internet]. 2006 [cited 29 Nov. 2018]; 23(3):363-368. Available from:
  4. Parker, G. Is depression overdiagnosed? Yes. BMJ [Internet]. 2007 [cited 29 Nov. 2018]; 335(7615):328-328. Available from:
  5. Mitchell A, Vaze A, Rao S. Clinical diagnosis of depression in primary care: a meta-analysis. The Lancet [Internet]. 2009 [cited 30 November 2018];374(9690):609-619. Available from:
  6. a b Arroll, B., Goodyear-Smith, F., Kerse, N., Fishman, T. and Gunn, J. Effect of the addition of a “help” question to two screening questions on specificity for diagnosis of depression in general practice: diagnostic validity study. BMJ [Internet]. 2005 [cited 28 Nov. 2018]; 331(7521):884. Available from:
  7. Sumathi, M. and Poorna, B. A Bayesian Framework for Diagnosing Depression Level of Adolescents. International Conference on Computing and Intelligence Systems [Internet]. 2015 [cited 29 Nov. 2018]; 04(2):1350-1354. Available from:
  8. Hagmayer, Y. and Engelmann, N. Causal beliefs about depression in different cultural groups — what do cognitive psychological theories of causal learning and reasoning predict?. Frontiers in Psychology [Internet]. 2014 [cited 29 Nov. 2018]; 5(1303). Available from:
  9. Ojeme, B. and Mbogho, A. Predictive Strength of Bayesian Networks for Diagnosis of Depressive Disorders. Intelligent Decision Technologies 2016 [Internet]. 2016 [cited 29 Nov. 2018]; 373-382. Available from:
  10. Shojaei Estabragh, Z., Riahi Kashani, M., Jeddi Moghaddam, F., Sari, S., Taherifar, Z., Moradi Moosavy, S. and Sadeghi Oskooyee, K. Bayesian network modeling for diagnosis of social anxiety using some cognitive-behavioral factors. Network Modeling Analysis in Health Informatics and Bioinformatics [Internet]. 2013 [cited 29 Nov. 2018]; 2(4):257-265. Available from:
  11. Neumeister A, Young T, Stastny J. Implications of genetic research on the role of the serotonin in depression: emphasis on the serotonin type 1A receptor and the serotonin transporter. Psychopharmacology [Internet]. 2004 [cited 5 December 2018];174(4):512-524. Available from:
  12. Caspi A, Sugden K, Moffitt T, Taylor A, Craig I, Harrington H et al. Influence of Life Stress on Depression: Moderation by a Polymorphism in the 5-HTT Gene. Science [Internet]. 2003 [cited 2 December 2018];301(5631):386-389. Available from:
  13. Hofstede G. Culture's consequences. 5th ed. Newbury Park: Sage; 1984.
  14. Ferrari A, Charlson F, Norman R, Patten S, Freedman G, Murray C et al. Burden of Depressive Disorders by Country, Sex, Age, and Year: Findings from the Global Burden of Disease Study 2010. PLoS Medicine [Internet]. 2013 [cited 8 December 2018];10(11):e1001547. Available from:
  15. Neurasthenia – an overview | ScienceDirect Topics [Internet]. 2018 [cited 8 December 2018]. Available from:
  16. Kuehner C. Gender differences in unipolar depression: an update of epidemiological findings and possible explanations. Acta Psychiatrica Scandinavica [Internet]. 2003 [cited 8 December 2018];108(3):163-174. Available from:
  17. Värnik P. Suicide in the World. International Journal of Environmental Research and Public Health [Internet]. 2012 [cited 8 December 2018];9(3):760-771. Available from:
  18. Brown G, Harris T. Social origins of depression. 30th ed. London: Tavistock Publ.; 1984.
  19. Nolen-Hoeksema S. Gender Differences in Depression. Current Directions in Psychological Science [Internet]. 2001 [cited 8 December 2018];10(5):173-176. Available from:
  20. Gans S, MD. When Were the Earliest Accounts of Depression? [Internet]. Verywell Mind. [cited 2018 Dec 9]. Available from:
  21. Depression [Internet]. [cited 2018 Dec 9]. Available from:
  22. Flint J, Chen Y, Shi S, Kendler K. Epilogue: Lessons from the CONVERGE study of major depressive disorder in China. Journal of Affective Disorders [Internet]. 2012 [cited 8 December 2018];140(1):1-5. Available from:
  23. Glanzberg M. Truth. In: Zalta EN, editor. The Stanford Encyclopedia of Philosophy [Internet]. Fall 2018. Metaphysics Research Lab, Stanford University; 2018 [cited 2018 Dec 9]. Available from:

Subjective and Objective Truth in AI

Objective Truth in AI

Artificial intelligence (AI) is often thought to make objective decisions easier. Here, objectivity refers to conclusions based on critical thinking and scientific evidence, where the conclusion is indisputable and there is only one true answer[1]. Built from formulae and algorithms, AI can process vast amounts of data to reach conclusions that are significantly more accurate, and therefore arguably more objective, than a human could achieve[2].

An example of this is machine learning and the task of identifying subjects in pictures. Though simple for humans, AI needs repetitive training on massive amounts of data to tell the difference between drinks, or between a table and a stool. A neural network takes one or more inputs, such as a picture, and processes them into one or more outputs, such as whether the picture shows wine or beer. The network consists of many 'neurons' grouped into layers, where one layer interacts with the next through weighted connections: each neuron carries a value, which is multiplied by a weight and passed to the neurons of the subsequent layer.[3] Bias terms, analogous to the statistical bias Eθ(θ̂) − θ[4], can also be coded into the network and passed through the layers. As a result, inputs are propagated through the whole network and the machine is taught to make predictions and draw conclusions that are as accurate as possible. This continual training can produce decisions for extremely complex problems[5].
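The layer-by-layer arithmetic described above can be sketched in a few lines (the weights, biases and inputs are arbitrary toy values, not taken from any real system):

```python
import math

def layer(inputs, weights, biases):
    # each output neuron: weighted sum of the inputs plus a bias term,
    # squashed by a sigmoid activation into the range (0, 1)
    outputs = []
    for w_row, b in zip(weights, biases):
        z = sum(x * w for x, w in zip(inputs, w_row)) + b
        outputs.append(1 / (1 + math.exp(-z)))
    return outputs

# toy network: 3 input features -> 2 hidden neurons -> 1 output neuron
x = [0.5, -1.0, 0.25]
hidden = layer(x, [[0.4, 0.1, -0.2], [0.3, -0.5, 0.8]], [0.0, 0.1])
out = layer(hidden, [[1.2, -0.7]], [0.05])
print(out[0])  # a value in (0, 1), readable as e.g. P(picture shows wine)
```

Training consists of repeatedly nudging the weights and biases until the outputs on labelled examples become as accurate as possible.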

As used by Accenture in their teach-and-test framework for AI[6], the continual connectivity and data processing mentioned previously can be tracked, and decisions or conclusions reached by the AI system can be questioned. The AI can even be coded to justify the decisions it reaches[7]. This can provide peace of mind that the AI is achieving human-centred, unbiased and fair conclusions – objectivity.

Subjective Truth in AI

It is often argued, however, that the supposedly objective decisions made by AI end up being subjective because the data sets used are biased[8][9]. Here, subjectivity refers to belief based on personal opinions, experiences and feelings rather than on scientific evidence[10]. As human beings we all have our own biases, and no one can be truly objective[11]. Since we create both the AI itself and the data it processes, it follows that AI is never going to be entirely objective.

Gender and ethnicity biases are often unconsciously built into algorithms. A notable example is AI facial-recognition software identifying black women as men[12][13]. It is suggested that this stems from the unconscious bias of computer scientists and engineers, the majority of whom are white and male[13]. Similarly, a Google image search for 'CEO' will bring up pictures of men, while 'helper' will bring up pictures of women[14]. This reflects biased data sets about what a CEO looks like: most CEOs are indeed men, but that fact rests on historical patriarchal structures that are now generally considered wrong[15].

As AI becomes increasingly prominent in everyday life (self-driving cars, Google Home devices, advertising, and many other applications), ethics needs to be considered. Ethics can be defined as the means of tackling questions of morality[16], but it can be interpreted differently according to one's opinions, beliefs and perspectives; as a result, trying to create ethical AI is likely to cause many problems, especially when its decisions are coupled with potentially biased data[17].

Interdisciplinary Approach to AI

From a mathematical, objective point of view, AI provides computing and decision-making power that humans will never be able to match on their own, giving deeper insight into complex problems. From a subjective, ethical and philosophical standpoint, AI will never be truly objective[18], and we are likely to run into significant problems where AI 'gets it wrong', such as the 2010 Flash Crash, in its pursuit of 'the truth' or of a logical conclusion[19][20].

As an example, AI could be used in recruitment to eradicate unconscious bias in hiring[21]. However, if a machine-learning algorithm were used, data about gender, race, disability and so on could lead the AI to decide to hire white, straight, able-bodied men, who according to the biased data are the least risky and therefore most cost-effective employees[22]. It could easily pick up our own biases and amplify them[20]. And because machine learning operates as a black box, we input data and get data out; without auditing the results, we could be completely unaware of which data points the AI used to inform its decisions[22].
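One simple form such auditing can take is comparing the model's selection rates across groups. The records below are invented, and the 0.8 threshold (the "four-fifths rule" used in US employment guidance) is one conventional choice, not the only one:

```python
# Hypothetical audit sketch: compare hiring-recommendation rates per group.
decisions = [
    {"group": "A", "hired": True}, {"group": "A", "hired": True},
    {"group": "A", "hired": True}, {"group": "A", "hired": False},
    {"group": "B", "hired": True}, {"group": "B", "hired": False},
    {"group": "B", "hired": False}, {"group": "B", "hired": False},
]

def selection_rates(records):
    totals, hires = {}, {}
    for r in records:
        totals[r["group"]] = totals.get(r["group"], 0) + 1
        hires[r["group"]] = hires.get(r["group"], 0) + r["hired"]
    return {g: hires[g] / totals[g] for g in totals}

rates = selection_rates(decisions)
ratio = min(rates.values()) / max(rates.values())
print(rates, "disparate impact ratio:", ratio)  # flags possible bias if < 0.8
```

An audit like this only inspects outputs; it cannot say why the black box favoured one group, but it does make the disparity visible.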

AI struggles to be truly objective when presented with problems that have ethical questions tied to them[23]. However, evaluating AI from an interdisciplinary perspective ensures considered thought about the effects of AI and the decisions it has to make. Computer science and electronic engineering obviously play a huge role in creating the technology, but philosophy and social sciences such as anthropology, economics and psychology are needed in the development of AI to ensure we produce systems that 'think' about the wider effects of their conclusions, making AI both useful and safe for humans to use in the future.

Notes

  1. Mulder, D. H, Objectivity [Internet]. Sonoma State University, California: Internet Encyclopedia of Philosophy; [updated 2004 Sept 9; cited 2018 Dec 9]. Available from:
  2. ICO, Big data, artificial intelligence, machine learning and data protection [Internet]. Cheshire, UK: Information Commissioner's Office; [updated 2017 May 17; cited 2018 Dec 9]. Available from:
  3. Marr, B., What Are Artificial Neural Networks – A Simple Explanation For Absolutely Anyone [Internet]. Forbes; [updated 2018 Sept 24; cited 2018 Dec 9]. Available from:
  4. Estimation, bias, and mean squared error [Internet]. Cambridge, UK: Statistical Laboratory; [updated 2018; cited 2018 Dec 7], pp.2. Available at:
  5. Luger, G. F., 'Foundations for Connectionists Networks'. In: Artificial Intelligence: Structures and Strategies for Complex Problem Solving. Essex: Pearson Education Limited; 2005. p. 455
  6. Bennink, J., Accenture Launches New Artificial Intelligence Testing Services [Internet]. Chicago: Accenture; [updated 2018 Feb 20; cited 2018 Dec 9]. Available from:
  7. Cathelat, B., 'How much should we let AI decide for us?' In: Brigitte Lasry, B. and Kobayashi, H., UNESCO and Netexplo. Human Decisions Thoughts on AI. Paris, France: UNESCO Publishing; 2018. p. 132-138. Available from:
  8. Vanian J., Unmasking AI's bias problem [Internet]. New York: Fortune; [updated 2018 Jun 25; cited 2018 Dec 2], Available from:
  9. Srinivasan, R., 'The Ethical Dilemmas of Artificial Intelligence' In: Brigitte Lasry, B. and Kobayashi, H., UNESCO and Netexplo. Human Decisions Thoughts on AI. Paris, France: UNESCO Publishing; 2018. p.107. Available from:
  10. Francescotti, R., Subjectivity [Internet]. Abingdon; Routledge Encyclopedia of Philosophy; [updated 2017 April 24; cited 2018 Dec 9]. Available from:
  11. Naughton, J., Don't worry about AI going bad – the minds behind it are the danger [Internet]. London: The Guardian; [updated 2018 Feb 25; cited 2018 Dec 4]. Available from:
  12. Lohr, S., Facial Recognition Is Accurate, if You’re a White Guy [Internet]. New York: The New York Times; [updated 2018 Feb 9; cited 2018 Dec 3]. Available from:
  13. a b Buolamwini, J. and Gebru, T. Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification. JMLR. 2018. [cited 2018 Dec 3] 81:1–15. Available from:
  14. Devlin, H., AI programs exhibit racial and gender biases, research reveals [Internet]. London: The Guardian; [updated 2017 Apr 13; cited 2018 Dec 4]. Available from:
  15. Johnson, A. G., 'What is this thing called patriarchy?' In: The Gender Knot. Philadelphia, PA: Temple University Press; 1997. p. 6
  16. Mulder, D. H, Ethics [Internet]. Sonoma State University, California: Internet Encyclopedia of Philosophy; [updated 2004 Sept 9; cited 2018 Dec 9]. Available from:
  17. Bostrom, N. and Yudkowsky, E. 'The ethics of artificial intelligence'. In: The Cambridge Handbook of Artificial Intelligence. Cambridge: Cambridge University Press; 2011. p. 316-334. Available from: [cited 2018 Dec 9]
  18. Moor, J. H., 'The Nature, Importance, and Difficulty of Machine Ethics'. In: Anderson, M. and Anderson, S. L. Machine Ethics. New York: Cambridge University Press; 2011. p. 13. Available from: [cited 2018 Dec 3]
  19. Jøsang, A., Artificial Reasoning with Subjective Logic [Internet]. Trondheim, Norway: Norwegian University of Science and Technology; [updated 1997; cited 2018 Dec 3]. Available from:
  20. a b Newman, D., Your Artificial Intelligence Is Not Bias-Free [Internet]. Jersey City, New Jersey: Forbes; [updated 2017 Sept 12; cited 2018 Dec 3]. Available from:
  21. Lee, A. J., Unconscious Bias Theory in Employment Discrimination Litigation, Harvard Civil Rights-Civil Liberties Law Review. 2005. [cited 2018 Dec 3]; 40(2): 481-504. Available from:
  22. a b Tufekci, Z., Machine intelligence makes human morals more important [Internet]. Banff, Canada: TEDSummit; [updated 2016 Jun; cited 2018 Dec 3]. Available from:
  23. Polonski, V., The Hard Problem of AI Ethics – Three Guidelines for Building Morality Into Machines [Internet], Paris, France: The Forum Network, hosted by the OECD; [updated 2018 Feb 28; cited 2018 Dec 3]. Available from:

Truth and the Environment

Is there such a thing as a universal and fixed truth? People's different opinions and values shape their understanding of what they believe to be the truth. But if everyone claims with certainty that their truth is absolute, who is right?

Today, the environment is a polarising subject, as people's contradictory views of the truth generate heated discussion. People hold different truths and understandings of the world. Some perceive climate change as a real issue, "a man-made disaster of global scale. Our greatest threat in thousands of years"[1]. They worry about the planet, claiming that mass extinction will occur, that resources will not be able to keep up with our needs, and that we are experiencing global warming as ice caps melt in the coldest parts of the world and temperatures climb. Scientific evidence appears to prove the accuracy of these claims, but do we know for a fact that it is true?

The Facts

When looking at the facts, we see that the average temperature on the earth's surface has risen by 0.9 degrees Celsius since the 19th Century, and most of this increase has resulted from carbon dioxide emissions along with other greenhouse gases.[2] Oceans have also experienced a substantial increase in temperature, leading to the shrinkage of ice sheets at the poles and decreased snow cover, which has resulted in a rise of sea levels,[3] the alteration of the ocean's pH balance and an increase in its acidity by 30%.[4]

Despite accurate models and the evidence provided, facts are not always sufficient evidence of truth because of the 'myths' surrounding the topic.[5] Knowledge about climate change leads to existential questions, such as "how can we combat these issues to avoid catastrophic risks for future generations?" Individuals may reject climate change in order to avoid facing that responsibility, ultimately leaving the facts powerless.[6]

Manipulation Of Evidence Can Distort The Truth

Statistical reliability is vital whenever any data is being analysed. Numbers can't lie, but they can be used to lead someone towards a particular belief.

Misleading statistics

Misleading statistics are the wrongful exploitation of numerical evidence: the reader receives deceptive information and comes to believe something false.[7] The evidence for global warming is often alleged to have been manipulated, with scientific results taken out of context in order to downplay its effects. According to NASA's Goddard Institute for Space Studies, the global mean temperature was 58.3 degrees Fahrenheit in 1998 and 58.2 in 2012. Sceptics deduced that, since the global mean temperature decreased, global warming was disproved. However, this reading places the results on an irrelevant time-frame: 1998 was one of the hottest years on record because of the El Niño event, and temperature trends are typically measured over cycles of at least 30 years. The long-term data from 1900 to 2017 paint a clear picture of gradual warming.[8]
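The effect of the chosen time-frame can be reproduced with synthetic numbers (the series below is invented, not real temperature data): a single hot spike at the start of a short window turns a warming trend into an apparent cooling one.

```python
# Illustration with synthetic data: how the choice of time-frame
# changes the apparent trend.
def slope(xs, ys):
    # ordinary least-squares slope of ys against xs
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

years = list(range(1900, 2018))
# gradual warming of 0.008 °C/year plus a one-off spike in 1998 (El Niño-like)
temps = [14.0 + 0.008 * (y - 1900) + (0.4 if y == 1998 else 0.0) for y in years]

long_term = slope(years, temps)               # positive: gradual warming
cherry = slope(years[98:113], temps[98:113])  # 1998-2012 window, starts at the spike
print(long_term, cherry)  # the cherry-picked slope comes out negative
```

The long series recovers the underlying warming; the window that begins at the anomalous year suggests cooling, which is exactly the misreading described above.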

Correlation vs causation

Another factor that leads people to opposing beliefs is the problem of correlation versus causation: the occurrence of one variable is not necessarily linked to the occurrence of another[9]. There is a strong correlation between rising CO2 levels and rising global temperatures, yet however high the correlation, it could in principle be a numerical coincidence, and this is what some sceptics choose to see. Those who do believe in climate change read the evidence differently: they hold that attribution studies have become so strong that there is no question CO2 is the main driver of global warming.[10]

The occasional lack of clear causation accounts for the different truths that opponents and proponents of climate change claim.
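The correlation-causation point can be made concrete with two invented series (nothing below is real climate data): any two quantities that both drift upward over time will show a strong correlation, whether or not one causes the other.

```python
# Illustration with synthetic numbers: shared trends produce high
# correlation even without any causal link.
def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

years = range(30)
rising_a = [300 + 2.0 * t + (3 if t % 7 == 0 else 0) for t in years]   # steadily rising
rising_b = [50 + 1.5 * t + (-2 if t % 5 == 0 else 1) for t in years]   # also rising, unrelated wobble
print(pearson(rising_a, rising_b))  # close to 1 despite no causal link
```

Correlation alone therefore cannot settle the debate; it is the physical attribution studies cited above that turn the CO2-temperature correlation into a causal claim.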

Scepticism

The psychology of climate change denial

The human brain is not built to take climate change seriously: its consequences seem distant, and the power of individual action seems microscopic. For these reasons, it is perhaps remarkable that only 8% of the population does not believe in climate change.[11] So what motivates these individuals, and why are sceptics so influential?

Why deny?

Economic causes

Scepticism[12] can be linked to the importance of fossil-fuel industries for the economy and for workers. Big polluting companies usually oppose climate-change action.[13] One of their main arguments is a lack of evidence for global warming, the claim being that the period over which we have been able to observe rising temperatures is not long enough to draw conclusions. For example, Gina Rinehart, CEO of a large Australian mining company, donated 4.5 million dollars to the Institute of Public Affairs, an Australian think tank promoting climate-change scepticism.[14] Media outlets and politicians are often accused of being paid to confuse the public's perception of the scientific evidence and to promote misleading messages.[15]

Even when the population is receptive to climate change, environmentally friendly policies can be hard to implement. One example is the 'Gilets Jaunes' movement in France: as President Macron tried to raise taxes on diesel in an attempt to reduce greenhouse gases, he was met by massive public demonstrations. For the protesters, the truth is that the government is reducing their consumer surplus by raising taxes in order to balance the economy. As the Financial Times explained, the President is attempting "to reconcile the climate issues that 'evoke the end of the world' with the social needs of those who 'talk about the end of the month'"[16], an example of temporal discounting. Perhaps some people know the truth but keep it to themselves, choosing to conform because they are afraid, as shown in the Asch conformity experiments[17].

Political causes

Climate change also represents an economic threat. Many people in the current US government, including President Donald Trump[18], deny climate change. Conversely, former US President Barack Obama was so concerned about the impact of climate change that he participated in the documentary 'Before the Flood', stating that "if we keep pushing, there is no reason we can't solve this problem". Is it conceivable that opposition to climate change is linked to politics? Could we speculate that Trump knows about the dangers of climate change but dismisses them because the US is addicted to fossil fuels and is the biggest emitter of greenhouse gases in history? A recent study shows a strong link between climate-change denial and right-wing nationalism.[19] On this matter, the philosopher Theodor Adorno stated: "The conversion of all questions of truth into questions of power not only suppresses truth as with earlier despotic orders, but has attacked the very heart of the distinction between true and false."[20]

Conclusion

Climate change is strongly supported by scientific research, yet doubts persist. Powerful narratives of climate change denial resonate with a fraction of the population, promoting scepticism about the topic and the scientific evidence behind it. Scientists need to embrace the fact that, in the public arena, truth is a set of beliefs carried by a powerful narrative, not just cold hard facts. Saving our planet requires it.

  1. Matt McGrath. BBC. Sir David Attenborough: Climate change 'our greatest threat'. Available from: [Accessed December 1st 2018]
  2. NOAA National Centers for Environmental Information. Global Climate Change Indicators. Available from: [Accessed December 5th 2018]
  3. NASA Jet Propulsion Laboratory. Ramp-Up in Antarctic Ice Loss Speeds Sea Level Rise. June 13, 2018. Available from: [Accessed December 4th 2018]
  4. Intergovernmental Panel on Climate Change. Climate change: How do we know? Available from: [Accessed December 4th 2018]
  5. Dr. Jan Dash, John Cook. Skeptical Science. Global Warming & Climate Change Myths. Available from: [Accessed December 4th 2018]
  6. Kieran Setiya. School of Humanities, Arts and Social Sciences, MIT News. 3 Questions: How philosophy can address the problem of climate change. February 8th 2017. Available from:
  7. Mona Lebied. Misleading Statistics & Data. Available from: [Accessed December 4th 2018]
  8. Matt McGrath. BBC. Sir Attenborough: Climate change 'our greatest threat'. Available from: [Accessed December 3rd 2018]
  9. Iperceptions. Causation vs Correlation – What's the difference? Available from: [Accessed December 4th 2018]
  10. Bryan Angliss. Climate Science for Everyone: Correlation and Causation. Available from: [Accessed December 5th 2018]
  11. Nicole Mortillaro. The Psychology of Climate Change: Why People Deny the Evidence. Available from: [Accessed December 8th 2018]
  12. Grist. How to Talk to a Climate Skeptic: Responses to the Most Common Skeptical Arguments on Global Warming. Available from: [Accessed December 1st 2018]
  13. Jean-Daniel Collomb. The Ideology of Climate Change Denial in the United States. European Journal of American Studies. 2014. Available from: [Accessed December 5th 2018]
  14. Graham Readfearn. The Guardian. Gina Rinehart company revealed as $4.5m donor to climate sceptic think tank. Available from: [Accessed December 5th 2018]
  15. Constance Lever-Tracy. Routledge Handbook of Climate Change and Society. Routledge International Handbooks. July 12th 2010. Available from: [Accessed December 5th 2018]
  16. Harriet Agnew. Financial Times. Macron promises consultation on green policies after fuel tax protests. Available from: [Accessed November 30th 2018]
  17. HeroicImaginationTV. Asch Conformity Experiment. Available from: [Accessed December 7th 2018]
  18. Detlef F. Sprinz, Håkon Sælen, Arild Underdal & Jon Hovi. The effectiveness of climate clubs under Donald Trump. Climate Policy. 2017. Volume 18, Issue 7, 828–838. Available from: [Accessed December 4th 2018]
  19. Chalmers. Climate change denial strongly linked to right-wing nationalism. Available from: [Accessed December 8th 2018]
  20. Theodor W. Adorno. When questions of truth become questions of power. Minima Moralia. 1946. Available from: [Accessed December 5th 2018]

Truth, History and Society: How good is Twitter at telling the Truth?

Introduction

Social media platforms such as Twitter are used to circulate information, but history teaches us that such vehicles of information can lead to an inaccurate representation of truth. 'Truth' is treated as an issue in this chapter because it is frequently distorted by institutions or individuals, a phenomenon known as misinformation. Misinformation is incorrect information that circulates either with a motive or unintentionally; UNESCO describes its deliberate form as content 'deliberately created to harm'.[1] History demonstrates that, in certain circumstances, historical 'distributors' of truth, such as the Christian Church and national governments, are guilty of misinformation when it serves their interests or principles. This historical issue is even more relevant today with the existence of social media. The topic is often approached through a political or sociological lens; here we aim to provide a previously neglected historical background, in order to grasp a deeper knowledge of the transmission and impact of misinformation in our society. We will use history as a discipline to address the issue of misinformation and focus on the social networking service Twitter to demonstrate its relevance in the 21st century.

Why is Twitter so popular?

It is commonly thought that social media platforms such as Twitter are the ideal vehicle for circulating information. Indeed, anyone can consult them and contribute to them, allowing a multitude of opinions to emerge on a single question, meaning multiple visions of the truth. Twitter is built on the principle of free speech, as its homepage advertises: "We believe in free expression and think every voice has the power to impact the world" (Twitter 2018). Additionally, its structure is shaped specifically to facilitate the distribution of information in an era where we are drowning in knowledge. It does so by limiting every post to 280 characters and by registering our worldview and opinions through cookies, which allow it to show only the posts and accounts relevant to each user. Hence, the accounts presented to us are limited by our own preferences, following the human tendency to establish contact with similar individuals, known as homophily. Yet the very features that make Twitter so appealing (accessibility, freedom of expression and cookies) are also problematic. Twitter's inherent structure allows misinformation to emerge and creates confusion about the accuracy of the information we access. However, this issue did not emerge with the birth of social media platforms; rather, it has existed for hundreds of years, as this chapter will explore.

Misinformation in History

In this section, we question the Church as an institution and national governments as vehicles of information, and ask to what extent they can be misleading in the truth they transmit. Both institutions had great influence on public opinion and placed themselves as holders of absolute truth; yet history shows circumstances in which both spread disinformation. With the Enlightenment, philosophers and intellectuals questioned the role of government and pointed out the Church's dogmatism.[2] The Church considered its morals, principles and laws as absolute truth. However, the Catholic and Protestant Churches preached different values, and thus different versions of truth, which had violent consequences, such as the St Bartholomew's Day massacre of 1572, in which Huguenot Protestants were assassinated by Catholics. One such division was created by the Protestant John Calvin with his idea of the sovereignty of the people, which was severely contested by Catholics because it questioned their version of the truth. National governments controlled truth to such an extent that they were able to 'erase' entire populations. For instance, Australia was declared "nobody's land" by the British to rationalise their settlement in 1770, negating 50,000 years of Aboriginal history.[3] Moreover, control over newspapers, radio and television was another way for governments to control truth: in France during World War One, the scale of the war was played down to ensure the population's cooperation.[4] Finally, the Cold War demonstrates how nations brutally defend their version of truth, in this case capitalism and communism, by misrepresenting the opposing party through the media.[5] The misinformation the East spread about the West led to John F. Kennedy's famous statement "Let them come to Berlin", whereby he defended the West by inviting all to come and see the 'truth' for themselves. History proves it is easy to be misled about the truth. Looking at misinformation through the lens of history gives us a new perspective from which to analyse the current political issue of false news in circulation, for instance on Twitter.

The agency of misinformation on society

In the past, misinformation was used by institutions and leaders to promote the values and principles they took for truth, even when they were not accurate. Today Twitter, with its 313 million monthly users, has the power to promote certain 'truths', since its users influence each other in the same way citizens were influenced by institutions in the past. According to MIT research, fake news and lies spread six times faster than accurate information,[6] and false news can reach up to 100,000 people while the truth rarely circulates beyond 1,000. False news is dangerous because it has huge impacts on our societies. The same MIT paper demonstrated that false information is shared considerably more when it concerns politics. For instance, a surge in false news posts was recorded during the US presidential election in 2016 and the Brazilian election in 2018.[7] Consequently, 92% of Brazilians are worried about being able to distinguish truth from falsehood online.[8] Fake news becomes an international issue when it leads to the misallocation of resources during attacks or environmental catastrophes, or disturbs the financial stability of companies or even a state.[9] Some countries, such as the UK and Brazil, and companies such as Adverifai actively fight fake news through regulations and policies, but this raises controversy.[10] Researchers in this field emphasise the need for large-scale models to analyse the spread of false news and to prevent it in the future.[11]

Conclusion

In conclusion, misinformation is a critical current challenge, but it is not new. As history demonstrates, biased vectors of truth have always existed and have shaped people's opinions, whether for political or religious purposes, in the past and today, and hence the way society functions. Previously studied as an issue within the discipline of politics, misinformation can be better understood with an interdisciplinary view. Through the lenses of history and politics, we aim at a global understanding of it. Thereupon, we invite you to approach this seemingly emerging situation by also considering the past.

Truth in Subconscious Racism

Introduction

‘Truth’ is defined as statements in accordance with factual reality, or statements labelled as ‘true’ through popular consensus. The debate surrounding ‘truth’ in racial attitudes elicits interdisciplinary conflict due to varying definitions of 'truth'.

The Implicit Association Test (IAT) measures automatic associations and thus our subconscious bias. Participants rapidly sort stimuli into paired categories, for example a social group ('African-American') paired with an evaluative word ('pleasant') [12]. When two concepts are strongly linked in a participant's mind, sorting them together is faster and more accurate, and the difference in response times between pairings yields the score. The results enable us to consider the extent to which these attitudes may indicate our inherent beliefs.
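As a rough illustration of how reaction-time data of this kind can be turned into a bias score, the sketch below computes a simplified measure in the spirit of the IAT's D-score: the mean latency difference between the two pairing conditions, divided by the pooled standard deviation. The function name and the numbers are invented for demonstration; the published IAT scoring algorithm involves additional steps such as error penalties and trial filtering.

```python
import statistics

def iat_d_score(congruent_ms, incongruent_ms):
    """Simplified IAT-style effect size: the mean latency difference
    between the incongruent and congruent sorting blocks, divided by
    the standard deviation of all trials pooled together. A larger
    positive value means the congruent pairing was sorted faster,
    i.e. a stronger automatic association."""
    diff = statistics.mean(incongruent_ms) - statistics.mean(congruent_ms)
    pooled_sd = statistics.stdev(congruent_ms + incongruent_ms)
    return diff / pooled_sd

# Invented reaction times in milliseconds for one participant
congruent = [620, 650, 640, 610, 660]      # e.g. the associated pairing
incongruent = [780, 820, 790, 810, 800]    # e.g. the reversed pairing
print(round(iat_d_score(congruent, incongruent), 2))
```

Dividing by the pooled standard deviation makes scores comparable across participants with different overall reaction speeds, which is why the paragraph above can speak of individual variation on a common scale.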

This chapter addresses whether our subconscious reflects our racist beliefs.

Disciplinary Approaches

Behavioural Psychology

Implicit cognition reveals our lack of complete conscious control in social perception and judgement, allowing our actions to diverge from our explicit beliefs [13]. Arguably, the IAT reflects modal cultural beliefs, and individual variation is merely the extent to which one has absorbed culturally defined principles [14]. However, meta-analytic studies support the predictive ability of the IAT [15] for behaviour such as non-deliberate discrimination. In fictitious interview experiments, non-verbal discomfort and less time spent speaking to black applicants were observed more frequently in white students with higher IAT scores [16]. In a follow-up, interviewers directed these same behaviours towards white applicants, causing them to interact more uncomfortably and lowering their performance [16]. Once legitimate reasons are ruled out, this suggests applicant rejections may be based on subconscious racial bias [17], offering a conceivable explanation for the systematically disadvantageous outcomes for African-Americans in healthcare, education, criminal justice and employment [18].

Regarding healthcare, the Unequal Treatment report documented multiple racial disparities [19] and identified subconscious racial bias as a potential cause, indicative of a deeper truth about one's racial beliefs. For example, a correlation was found between higher pro-white bias in paediatricians and a decreased likelihood of prescribing post-operative narcotics to black children [20]. These studies conclude that the majority of US health care providers display implicit pro-white bias [21] and that black patients give the most negative ratings of interaction friendliness with physicians [22]. Incorporating the IAT within medical training and encouraging interracial relations has reflected positively on the treatment of African-American patients [23].

Sociology

The 'truth' derived from the IAT contrasts with more subjective and relative sociological truths, which are formed with respect to an environment. Subconscious beliefs do not necessarily reflect our values as they are inherently imprinted with racial bias from societal norms. Therefore, whilst our subconscious is a result of social constructs, conscious actions can be independent from these.

Sociologist Herbert Blumer argued within symbolic interactionist theory that race prejudice arises from a "sense of group position" [24], where racial groups are "a historical product" [25] and "a result of experience" [24]. On this view, racial groups would not arise without the experience of racial differences. Blumer thus argued that our racism is intrinsically shaped by history, while our consciousness is hardly influenced by these biases, which therefore have no effect on our actions [26]. These racial groups can be read as an application of Émile Durkheim's 'collective consciousness', in which individuals are bound to groups by embedded social constructs of which they are rarely conscious [27].

Sociology further disputes the IAT's 'truth' because of its focus on reaction speed. Sociological reflexivity allows reflection on our actions, as shown by a study on "Reflexive Intergroup Bias" in which individuals penalised either members of their own group (e.g. race) or an opposing group [28]. Biased penalties against opposing groups came with fast responses, whilst with time for reflection, or "rational deliberation" [29], similar penalties were granted to both. This capacity for reflection thus overrides the 'truth' of the subconscious by allowing conscious values to drive one's conduct.

Neuroscience

Neuroscientific study on subconscious bias focuses on the amygdala, a subcortical structure which uses cognitive biases to process stimuli instantaneously [30]. The connection between high levels of measured amygdala activity and individuals who score high on the IAT was first reported by Phelps who found a ‘significant correlation’ [31], thus concluding that high IAT scores are reflective of ‘subconscious racism’.

However, this conclusion has been criticised within neuroscience. Firstly, high amygdala activity is not in itself proof that subconscious biases are being actively invoked, because the amygdala's subdivisions and connectivity also affect its activity levels. These features have not been acknowledged 'in any depth' [32] by current research, leaving the definite cause of amygdala activity unclear and thus delegitimising conclusions based upon it. Additionally, alternative neuroscientific research suggests that some subconscious biases may be innate, as in the case of the 'inside-outside bias' [32]; this occurs as a product of evolution, when a person quickly categorises strangers as being 'inside' or 'outside' their own group based on obvious differences, rather than from learned racial prejudice of the kind that could be described as subconscious 'racism'. Equating the existence of subconscious bias with the existence of subconscious racism is a subjective truth which relies upon one's own understanding of what 'racism' really means; this does not befit neuroscience's view of truth as objective and its practice of only asserting that something is 'true' if it can be scientifically measured, e.g. by an MRI brain scan.

Social Psychology

Despite criticisms of the IAT by neuroscientists, social psychologists argue for the IAT’s validity in determining truth regarding subconscious racial beliefs as it provides a more genuine response than other research methods can.

An alternative to the IAT is explicit self-reported data. By asking participants directly about their values, researchers obtain data on people's conscious views, unlike the IAT, which records immediate responses reflecting the subconscious. While one might predict a significant difference between the two, research indicates a high correlation between IAT results and explicit self-report measures [33].
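The "high correlation" claim can be made concrete with a small sketch of the underlying statistic, the Pearson correlation coefficient between paired implicit and explicit scores. The scores below are invented solely to illustrate the computation; they are not data from any cited study.

```python
import statistics

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two paired samples:
    covariance of the pairs divided by the product of the two
    standard deviations, giving a value between -1 and 1."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Invented paired measurements: implicit (IAT-style) scores vs.
# explicit self-report ratings for six hypothetical participants
implicit = [0.2, 0.5, 0.9, 1.1, 0.4, 0.7]
explicit = [1.0, 2.0, 3.5, 4.0, 1.5, 2.5]
print(round(pearson_r(implicit, explicit), 2))
```

A coefficient near 1 would indicate that participants who score high on the implicit measure also report stronger explicit attitudes, which is the pattern the cited research describes.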

A common critique of self-report measures is that they introduce distortions in the data since participants can deliberately skew their answers. Especially when the IAT is on socially sensitive topics such as race, impression management becomes relevant. People tend to control their answers to fit within the frame of what they find socially acceptable. As a result, self-report data is far more subjective and limited in predictive validity [34]. Social psychologists would hesitate to draw conclusions from such data, since they find the results too subjective to establish truth. Therefore, for research on subconscious racism, the IAT is a superior method as it minimizes the bias introduced in self-report measures on socially sensitive topics. Thus, social psychologists would argue the IAT provides a more solid indication of truth regarding subconscious racism than other data collection methods.

Conclusion

A monodisciplinary approach to studying subconscious racism ultimately leads to the loss of a broader understanding of the issue as it is approached in other fields. This allows each discipline to arrive at its own conclusive but incomplete and contradicting truth as demonstrated by the disciplinary studies described. As such, the need for future study of subconscious racism to be interdisciplinary is evident, especially given the nature of subconscious racism as fundamentally drawing upon both scientific theory of the subconscious and the societal, subjective concept of ‘race’.

References

Truth in Uchronia

Introduction

Uchronia, or "that which has no historical reality", is also known as "alternate history". It emerged as a literary genre which explores the possible outcomes in history if an important global event had occurred differently. This article aims to elucidate how uchronia acts as a vivid and effective medium for understanding truth. We will be exploring how an interdisciplinary approach through uchronia can provide deeper insight into the disciplines of history, literature, and philosophy. Uchronia wields the power to 'reverse' history and can be viewed as the "black mirror" of the contemporary world, revealing deeply entrenched political corruption and a brawl for power. What collective society holds as true and undeniable may be challenged in fiction, which acts as a conduit for reflection into how we approach history as a subject. In this chapter, we will discuss how truth is constructed in alternate history fiction and the philosophical discourse of truth in history, as illustrated through The Man in The High Castle (1962) by Philip K. Dick.

How is truth constructed in uchronia?

"The truth of art lies in its ability to break the monopoly of established reality (that is, of those who established it) to define what is real."[35]

—Herbert Marcuse

Uchronia as a literary genre

The concept of truth is inevitably linked to terms such as fact, reality, and authenticity which can be viewed as diametrically opposed to fiction. Uchronia falls within the literary genre of fiction, thus truth in uchronia may be perceived as purely fictional or even nonexistent. However, uchronia is also a literary genre based on historical reality; it must contain some truth since we are able to recognize the authenticity constructed within alternate history and learn from it. In uchronia, authors construct imaginative developments in historical realities which allow us to explore universal matters. Through changing incidental events and exploring the resulting ramifications, we are able to gain an understanding of the essential factors that affect the choices and values which built the world we know today. This reveals how uchronia, as a literary genre, can allow us to perceive certain truths in vivid actuality.

USA in The Man in the High Castle

Divergence point

Uchronia paves the way for contemporary reflection on historical determinism by providing another result to a contingent event. The moment the author steps away from historical reality is known as the divergence point. In The Man in the High Castle, this point occurs in 1933 with Roosevelt's assassination, after which V2 rockets were not requisitioned by the United States but instead contributed to Germany's technical superiority, resulting in World War 2 ending in victory for Germany and Japan rather than the Allies. Dick implies that in the 1960s, America's victory was not absolute, and neither would it have brought about permanent peace to the country, since the United States never actually ceased to be at war, whether it lost in the novel or won in reality.[36] Through uchronia, the author calls for readers to reflect on World War 2 history against the backdrop of mainstream historical discourse in post-war America, which was prone to manipulation by government powers of the time.

In typical historical novels, events are not modified but rather enriched with fictional content that does not contradict the historical reality. It can be argued that fiction enhances reality and hence makes it more compelling. In uchronia, authors posit the question of "what if" to construct an alternate reality, lending credibility to the story. Uchronia acts as a comparative dimension to historical reality and encourages active reading; through discerning the differences and similarities between the fictional reality laid out for us and our current reality, important features of truth are revealed to us. It is through learning about the driving forces behind historical narratives, such as greed, envy, and the pursuit of national interests that the reader learns to question his own reality through his own subjectivity.[32] While constructing these truths in alternate history, novelists expose the virtues and vices of the present.[32] In this way, history is "renewed" in modern culture, and alternate history becomes the production of popular culture.[37]

What do Uchronias tell us about History?

Challenging History

Truth is commonly understood as what objectively exists and has an impact on the "real" world. However, this view can be questioned, since many communities have begun to challenge the official and authoritarian character of history.[32] As K. Singles states, history is a narrative "with the actual world as a resource of representation"[38]. If this is indeed the case, it would mean there are alternate histories obscured in our collective memory which could prove just as valid as the official narrative of history.

Necessity of history

History is not simply the assembling of facts and events from the past, but rather, it is our interpretations of these historical truths and knowledge, which allows us in turn to understand the trajectory of our society. Uchronia does not alter historical truths; it is not an account of the author's perspective on what happened, nor is it a different perspective of past events. It is simply a fictional, imaginative account of what could potentially have happened at a divergence point. Uchronia is important as it teaches us about the variability of human nature and the factors which influence how we construct our understanding of present reality.

History as a lesson

George Santayana claims that "Those who cannot remember the past are condemned to repeat it".[39] From an optimistic perspective, uchronia can thus act as a warning against potential threats or events that could recur. In the case of The Man in the High Castle, we learn that a society created by a Japanese and Nazi victory in the war is not desirable; hence, we learn to ensure this image of the world is not recreated in our reality. Instead, through the fictional medium of uchronia we create alternatives and find solutions to incorporate into our current reality. Conversely, historical determinism asks whether human beings are the authors of history or the unconscious pawns of historical forces. If uchronia serves as a warning that can prevent events such as the Holocaust from happening again, then this presupposes a degree of individual freedom: human beings would then be responsible for their own actions, as advocated by existentialist philosophy.[40]

Conclusion

Uchronia offers readers a multitude of perspectives to learn from. It can be viewed as an alternate model which helps us make sense of the world refracted through a prism of varied subjectivities. Each author puts forth an argument about their own worldview through various narrative structures. Despite history being grounded in concrete facts, which creates the impression of an unchallengeable view, uchronia encourages us to engage with the relativity of history. Fiction can also be seen as a way of questioning the validity of a worldview. One thing is certain, though: if human beings cannot attain truth within the narrative of human experience, at the very least they possess the power of fiction, and its ability to manipulate the past in the pursuit of truth. The symbiotic relationship between literature, history and philosophy emphasises the importance of uchronia as an interdisciplinary medium in which to discern truth.

References

Truth in Ethnography

Introduction to Truth in Ethnographic Study

This chapter focuses on the superconcept of truth within the discipline of anthropology. The specific anthropological research method we will be considering in relation to truth is ethnography. Ethnography [41] allows us to explore the day-to-day lives of culturally diverse, non-westernised societies and to uncover social patterns and potentially universal social truths. To keep our research specific and focused, we have structured our wikibook around the question of the extent to which ethnographic study can find social and cultural truths in local societies. The working definition of truth [42] for this wikibook will be that which is true or in accordance with fact; alternatively, and more specifically to anthropology, truth can be defined as a fact or belief that is accepted to be true. Truth is a particularly pertinent issue for interdisciplinary study because it arises in both the arts and the sciences. While this wikibook is mainly situated within anthropology, truth also arises in related humanities subjects such as philosophy, with concepts like the correspondence theory of truth [43], and in more objective disciplines such as mathematical logic. We will first look at applications of ethnography in finding truths in local societies, in relation to the Indonesian village of the Donggo [44], before evaluating issues within the ethnographic method [45].

Applications of Ethnography in Finding Truths in Local Societies

Émile Durkheim, researcher of suicide rates

Ethnography informs us about particularly intricate aspects of life that other disciplines would find difficult to discover without the methods ethnography utilises. Most anthropologists assimilate to the culture of their host communities, enabling participant observation which uncovers truths that would otherwise remain unknown. A clear example can be found in Peter Just's fieldwork with the Dou Donggo [46][47]. His ethnographic study of a dispute in the Donggo involves the trial of La Ninde, who confesses to a crime he did not commit (the assault of Ina Mone) because "that was more true than what really happened" [48] (the truth being that he threatened Ina Mone because she knew he was unfaithful). If the villagers accepted this as just, how does that reflect on what we as humans consider to be true or just, and how does that relate to our own Western system of law? Just could only discover this anthropological truth because ethnography allowed him to uncover truths to which even official members of the local community may not have been privy. It was only because he was truly immersed in the local community that he could discover what actually happened, demonstrating the serendipitous nature of the ethnographic method in discovering truths.

The Dou Donggo on the island of Sumbawa in Indonesia

Unlike an ethnographer, a historian studying local truths would have worked primarily with court records, which would have rendered the case of La Ninde's assault on Ina Mone completely invisible; furthermore, communities such as the Donggo often keep no written records, which means the discovery of truth is to an extent only possible through immersion. The difficulty of discovering local truths is not confined to the limitations of historiography but extends to disciplines such as sociology and criminology, which rely primarily upon surveys, questionnaires and the analysis of official statistics. A well-known positivist study is Durkheim's exploration of differing suicide rates [49][50] among Protestants and Catholics, arguing that stronger social control among Catholics results in lower suicide rates; according to Durkheim, Catholic society has normal levels of integration while Protestant society has low levels. However, such methods would be unlikely to uncover the details of the case within the Donggo, nor the notion that the conviction was more "true" than the crime La Ninde actually committed.

Issues in Ethnography

While it is arguable that ethnography allows a deeper and more thorough understanding of a society, there is a counter-argument that it is less reliable due to the issue of ethnocentrism [51]. Ethnocentrism can be understood as the judgement of other cultures based on one's own cultural values. Historically this was evident in many ethnographic works of the late 19th century, when evolutionary anthropology was the dominant perspective on societies and cultures. Its key claims were that societies were not static, that social progress evolved from savagery to civilisation over many years, and that social diversity was explained by evolution proceeding at different speeds. These ethnographic claims were taken as true, which was problematic because they could be used to justify colonial intervention on the basis of speeding up social progression.[52]

It can be argued that this example of ethnocentrism is extreme and that this view of society is no longer accepted; however, some say that ethnocentrism is inescapable. Each ethnographer holds a set of beliefs and values from living in a particular culture, so it is impossible to observe another society without seeing it through a biased lens. Can any piece of ethnography, then, be accepted as completely true if it cannot study and describe society and culture objectively?[53]

The Nuer People of South Sudan

Finally, although the truth claims of evolutionary anthropology are generally no longer accepted[54], there have still been cases of ethnocentrism by anthropologists who ignore the historical contexts of societies. For example, in a study of the pastoral Nuer tribe[55][56], Evans-Pritchard saw a "balance equilibrium" in a condition of ordered anarchy, the result of "internal mechanisms". What he did not take into account was the fact that the Nuer were subject to Anglo-Egyptian government, so the ordered anarchy may have arisen because they had been pacified by stern imperial rule. His account implies the society is static and, unlike the evolutionary anthropologists, romanticises this primitive society.[57] The anthropologist has a considerable task in taking into account all the complexities of each society, let alone separating their own values from the ethnography, making it hard to take any ethnographic account as completely true.

Conclusion

Whilst ethnography is limited in that we cannot take the discoveries of anthropologists as absolute truth without considering how their own cultural preconceptions have influenced their findings[58], it is without doubt one of the most insightful and detailed ways of finding hidden truths in local societies. It is essential for uncovering the intricate, complex ways in which small communities operate in their daily lives, and how these seemingly simple interactions have large-scale applications to our own modern society and to how we as humans view the world around us. The "Dispute in the Donggo" demonstrated that ethnography can uncover truths that disciplines such as sociology, history and criminology would have difficulty finding. Different forms of truth apply to a multitude of situations, but what is so important about these truths is not only how we apply them within our own field of study, but how we look beyond it, considering how information found within one discipline can have unimagined applications within another. If we do not start communicating and linking ideas between disciplines, we will never truly move forward and reach our full potential as human beings.

Truth in Reproductive Biology

Sperm and ovum fusing

Humans develop particular truths and perspectives through their upbringing in a certain time and society. These implicit biases can affect fields that we consider far removed from culture. Our case study first surfaced when Emily Martin analysed descriptions in reproductive biology using an anthropological approach. She showed that the scientific truth (objective and positive)[59] differed from the cultural truth (subjective and normative)[60]. Textbooks describe male gametes as active and female gametes as passive, yet scientifically there is mutual participation of both sex cells[61]. This chapter explores the intertwining of culture with science in education, and how the interdisciplinary issue of truth is subconsciously manipulated to fit culturally determined norms into scientific research. It will consider this problem through different disciplinary lenses, discuss the implications when science is shaped by culture, and explore how an interdisciplinary approach can improve the field of biology.

History

A sculpture of Aristotle (384 BC–322 BC), a Greek philosopher and biologist who had a profound influence on Western culture.

Origin

Aristotle, a Greek philosopher and biologist, founded the modern understanding of fertilisation[62]. He observed that menstruation came to a halt when a woman became pregnant, and that a woman only became pregnant once sexual intercourse had taken place. Aristotle reasoned that there must be an 'active agent'[63] to initiate the process and a substance to be acted upon. He believed that 'the father ... makes a living creature by the power [...] in the semen'[64]; therefore the woman's role is passive and the man's active. The theory is based on logic and evidence; however, it is likely to have been moulded by Aristotle's social environment.[65]

Development

Centuries after Aristotle, famous philosophers such as Saint Thomas Aquinas (1225–1274)[66] still supported his theory. Even as science progressed, the original narrative did not change extensively. By 1890 it was accepted that the fusion of male and female gametes caused fertilisation, but the process was still described in gender-biased language. From the assumption that adult males have a 'shorter life span [and] greater activity'[67], in contrast to females who are 'more passive, vegetative, and conservative'[68], Sir Patrick Geddes and J. Arthur Thomson postulated that catabolism (the release of energy[69]) resulted in the birth of a male, while anabolism (the storage of energy[70]) resulted in a female. This demonstrates the gender bias that persisted throughout history and influenced the misconstrued truth of reproductive biology.[32]

Biology

Fertilisation is the fusion of two unique haploid gametes, the egg and the sperm, into a diploid zygote which then undergoes embryogenesis. When the egg and sperm bind, the acrosome and cortical reactions take place, which dissolve the zona pellucida of the egg, enabling a single sperm to be taken inside. Both gametes are active participants in reproduction.[71]

Although in education the egg is often portrayed as 'a dormant bride,'[72] waiting for the sperm to complete a 'perilous journey'[73] to fertilise it, this differs from the scientific truth.[74] The sperm does not 'swim' to the egg, in fact, 'the forward thrust of sperm is extremely weak'.[75] It is transported by semen and cervical mucus, which additionally 'protects, nurtures, and supports sperm' keeping it in good condition in the uterus.[76] When the egg and sperm bind, both release enzymes enabling the sperm to be taken into the ovum.

Fertilisation is therefore a mutually active, interdependent process.[24] Nevertheless, a recent article reviewing the language used in gynaecology textbooks found that 'of the 38 times textbooks mentioned the egg, 63.2% were in passive terms, as in "released" or "fertilized"', while '67% of sperm occurrences were active.'[77] Furthermore, there was limited mention of the importance of the woman's cervix and cervical mucus: 84.7% 'described the cervix passively—as a location, destination, object', while only 4.8% 'associated cervical mucus with sperm transport or ascent'.[78] Semen, on the other hand, was described as 'universally potent and fertile'.[79]

The repetition of this subtle but indoctrinating phrasing, which could easily be avoided, induces a psychological pattern of perceiving the female as weak and dependent on the male. Gender bias is thus reinforced through the anthropomorphising of sperm (implying an active, 'conscious mission'[16]) and the omission of the vital role of the female reproductive organs (suggesting that women are passive and less important).

Arts and Culture

Art expressing fertilisation can shed light on the extent of the issue by exposing how the perceived truth is currently omnipresent and reinforced in society. Since art is a medium of expression that can have a profound influence, it can also help to reject the cultural truth and promote a more objective one.

The sperm and ovum during fertilisation: visual media is also important. Here the sperm is enlarged for educational purposes, but this creates a false impression of its size and strength in comparison to the egg. Furthermore, the sperm's head is portrayed as pointy and sharp, ready for 'penetration' into the egg.

Literature

In some informative children's books, the presentation of fertilisation sows the early seeds of gender-biased views. Research conducted in mainland China exposes the socio-culturally influenced perspective on fertilisation[80]. The books personify sperm as boys or tadpoles in a running or swimming competition, with the winner ending up 'inside a bubble full of flowers'.[81] Such metaphors denote male competition over the 'prize' of women, reflecting China's own cultural atmosphere.

Films

In Look Who's Talking, the opening credits present a visual portrayal of fertilisation that promotes sperm superiority: cries of "yeehaw!" and "jackpot!" are exclaimed by the sperm once the egg is sighted, and 'penetration' of the egg follows a display of strength and perseverance.[82]

Woody Allen presents a satirical perspective on sex in his film Everything You Always Wanted to Know About Sex (But Were Afraid to Ask). The sperm are personified, through hyperbole, as men in soldier-like uniforms doing physical work, whilst the egg is merely a tool.[83] However, the weakness of the sperm is demonstrated through the main character's fear of entering the uterus. The deconstruction of narratives through pop culture could therefore be a possible solution to gender narratives in biology.

Education

In education, one could argue that metaphorical language (e.g. 'dormant bride') is a necessary mechanism for simplifying information for a younger and broader audience. Recently, more and more students have been required to take sex education, with it being mandatory in most of the EU and more countries predicted to follow[84]. The AP College Board reported that 'there is a widespread belief in education that it is impossible to expand access while maintaining high performance'[85]. Educators extend this argument, claiming that it is unrealistic to expect science textbooks written for adolescents 'to provide a second-wave feminist critique' of education[86]. The metaphors and language in fertilisation excerpts could therefore be a necessary reduction to accommodate the increasing number of students. However, these arguments are only relevant to high school, and there are still issues with university-level textbooks containing sexist narratives[24].


The use of gender-biased language in science and sex education is believed to reinforce negative stereotypes and have a deleterious effect on both genders[24] and on the economy. There is a danger that by 'presenting science in a gendered way', females will be 'deterred from[…] considering a[…] career in science'[24]. From an economic perspective, discouraging women from technology-related fields causes a loss of human capital through the failure to utilise 50% of the population[87]. This is reflected in recent reports demonstrating that 'the better a tech company's gender diversity, the greater its returns'[88].

Conclusion

Science is perceived as a field that builds truth through objective methodologies, where 'textbooks serve as authoritative sources of knowledge'[89]; it is therefore difficult to find and challenge cultural biases located within the discipline. However, by looking at the issue through an interdisciplinary lens, we can 'recognize areas where gender bias has informed how we think as biologists.'[32] When this bias is taken into consideration, fresh perspectives begin to emerge in science, constructing a new narrative that is closer to the objective truth.

References

  1. [1], UNESCO. Journalism, 'Fake News' and Disinformation: A Handbook for Journalism Education and Training. Available from: [Accessed 25th November 2018].
  2. [2], Valler I. Roman Catholic Church: A Transnational Actor. 1st ed. Cambridge: International Organization; 2009 Available from: [Accessed 21th November 2018].
  3. [3], Harari YN. 21 Lessons for the 21st Century. 1st ed. London: Jonathan Cape; 2018 Available from:[Accessed 21th November 2018].
  4. [4], D’Almeida F. Images et Propagande. 1st ed. Firenze: Casterman- Giunti Gruppo Editoriale; 1995 Available from: [Accessed 21th November 2018].
  5. [5], Mitter R, Major P. Across the Blocs, Cold war Cultural and Social History.1st ed. Oxfordshire: Frank Cass and Company Limited; 2004 Available from: [Accessed 7th December 2018].
  6. [6], Vosoughi S, Roy D, Aral S. The spread of true and false news online. Science. 2018;359(6380): 1146–1151. Available from: [Accessed 25th November 2018].
  7. [7], Tardáguila C, Benevenuto F, Ortellado P. Fake News Is Poisoning Brazilian Politics. WhatsApp Can Stop It. The New York Times. 2018. Available from [Accessed 6th November 2018].
  8. [8], Cellan-Jones R. Fake news worries 'are growing' suggests BBC poll. BBC News. 2017. Available from: [Accessed 6th November 2018].
  9. [9], Ellick BA, Westbrook A, Kessel MJ. Operation Infektion: The Worldwide War on Truth. The New York Times. 2018. Available from: [Accessed 25th November 2018].
  10. [10], AdVerifai. Artificial Intelligence for Ad Verification. Available from: [Accessed 25th November 2018].
  11. [11], Stephens D, Webber EJ. Fake news is Twitter flu: Chips with everything podcast. The Guardian. 2008. Available from: [Accessed 25th November 2018].
  12. Gladwell M. Blink: The Power of Thinking Without Thinking. London: Penguin Group; 2005. p.77-92 [Accessed: 25th of November 2018]
  13. Greenwald AG, Krieger LH. Implicit bias: scientific foundation. California Law Review. 2006;94(4): 946. Available from: [Accessed: 27th of November 2018]
  14. Olsen MA, Fazio RH. Reducing the influence of extrapersonal associations on the implicit association test: personalizing the IAT. Journal of Personality and Social Psychology. 2004;86(5): 653-667. Available from: [Accessed 28th of November 2018]
  15. Greenwald AG, Krieger LH. Implicit bias: scientific foundations. California Law Review. 2006;94(4); 960. Available from: [Accessed: 27th of November 2018]
  16. a b c Greenwald AG, Krieger LH. Implicit bias: scientific foundations. California Law Review. 2006;94(4): 962. Available from: [Accessed: 27th of November 2018]
  17. Greenwald AG, Krieger LH. Implicit bias: scientific foundations. California Law Review. 2006;94(4): 966. Available from: [Accessed: 27th of November 2018]
  18. Greenwald AG, Krieger LH. Implicit bias: scientific foundations. California Law Review. 2006;94(4): 966. Available from: [Accessed 27th of November 2018]
  19. Nelson AR, Smedley BD, Stith AY. Unequal treatment: Confronting Racial and Ethnic Disparities in Health Care. Journal of the National Medical Association. 2002;94(8): 666-668. Available from URL:[Accessed: 30th November 2018]
  20. Sabin JA, Greenwald AG. The in‭fl‬uence of implicit bias on treatment recommendations for 4 common pediatric conditions: pain, urinary tract infection, attention de‭fi‬cit hyperactivity disorder, and asthma. American Journal of Public Health. 2012;102(5): 988‭-‬995. Available from: ‭[Accessed: 2nd of December 2018]
  21. Maine IV, Belton TD, Ginsberg S, Singh A, Johnson TJ. A decade of studying implicit racial/ethnic bias in healthcare providers using the implicit association test. Social Science and Medicine. 2017;199: 219-229. Available from: [Accessed: 3rd of December 2018]
  22. Penner LA, Dovidio JF, West TV, Gaertner SL, Albrecht TL, Dailey RK, et al. Aversive racism and medical interactions with black patients: a ‭fi‬eld study. Journal of Experimental Social Psychology. 2010;46(2): 436‭-‬440. Available from: [Accessed: 26th of November 2018]
  23. Van Ryn M, Hardeman R, Phelan SM, Burgess DJ, Dovidio JF, Herrin J, et al. Medical school experiences associated with change in implicit racial bias among 3547 students: a medical student CHANGES study report. Journal of General Internal Medicine. 2015;30(12): 1748‭-‬1756. Available from: [Accessed: 4th of December 2018]
  24. a b c d e f Blumer H. Race prejudice as a sense of group position. The Pacific Sociological Review. 1958;1(1): 3. Available from: [Accessed: 1st December 2018].
  25. Blumer H. Race prejudice as a sense of group position. The Pacific Sociological Review. 1958;1(1): 5. Available from: [Accessed: 1st December 2018].
  26. Lyman SM. Interactionism and the study of race relations at the macro-sociological level: The contribution of Herbert Blumer. Symbolic Interaction. 1984;7(1): 110. Available from: [Accessed: 30th November 2018].
  27. Cole NL. The concept of collective consciousness. Available from: [Accessed: 30th November 2018].
  28. Yudkin DA, Rothmund T, Twardawski M, Thalla N, Van Bavel JJ. Reflexive intergroup bias in third-party punishment. Journal of Experimental Psychology: General. 2016;145(11): 1448–1459. Available from: [Accessed: 2nd December 2018].
  29. Yudkin DA, Van Bavel JJ. The roots of implicit bias. The New York Times. December 9 2016. Available from: [Accessed: 2nd December 2018].
  30. Barlow FK, Sibley CG. The Cambridge Handbook of the Psychology of Prejudice: Concise Student Edition. Cambridge University Press: 2018. Available from: [Accessed: 9th December 2018].
  31. Phelps EA, O'Connor KJ, Cunningham WA, Funayama ES, Gatenby JC, Gore JC, et al. Performance on indirect measures of race evaluation predicts amygdala activation. Journal of Cognitive Neuroscience. 2000;12(5): 732. Available from: [Accessed: 6th December 2018].
  32. a b c d e f g Chekroud AM, Everett JA, Bridge H, Hewstone M. A review of neuroimaging studies of race-related prejudice: does amygdala response reflect threat?. Frontiers in Human Neuroscience. 2014;8: 179. Available from: [Accessed: 6th December 2018].
  33. Hofmann W, Gawronski B, Gschwender T, Le H, Schmitt M. A meta-analysis on the correlation between the implicit association test and explicit self-report measures. Pers Soc Psychol Bull. 2005;31(10): 1369–85. Available from: [Accessed: 28th November 2018].
  34. Greenwald AG, Poehlman TA, Uhlmann EL, & Banaji MR. Understanding and using the implicit association test: III. meta-analysis of predictive validity. Journal of Personality and Social Psychology. 2008;97(1): 17-41. Available from: [Accessed: 27th November 2018].
  35. Marcuse H. The aesthetic dimension: toward a critique of Marxist aesthetics. Boston: Beacon; 1978.
  36. Gallagher C. War, counterfactual history, and alternate-history novels. Field Day Review. 2007;36(3):52-66. Available from: [Accessed 6th Dec 2018].
  37. Rodwell G. Whose history: engaging history students through historical fiction. South Australia: University of Adelaide Press; 2013. Available from: [Accessed 6th Dec 2018].
  38. Singles K. Alternate history: play with contingency and necessity . Boston: De Gruyter; 2013. Available from: [Accessed 28th Nov 2018].
  39. Santayana G, Gouinlock J. The life of reason or the phases of human progress: introduction and reason in common sense, volume VII, book one. In: Wokeck M, Coleman M. (eds.) The life of reason or the phases of human progress. MIT Press; 2001. Available from:[Accessed 8th Dec 2018].
  40. Mittal T. To be is to be: Jean Paul Sartre on existentialism and freedom. Available from: [Accessed 8th Dec 2018].
  43. Blackburn S. Correspondence theory of truth. In: The Oxford Dictionary of Philosophy. Oxford University Press; 2016.
  46. [12], J. Monaghan & R. Just (2000) Ch1 “A dispute in Donggo: fieldwork and ethnography” pp13-33 Social & Cultural Anthropology: A Very Short Introduction Oxford Uni Press
  47. Just, P. (1986). Let the evidence fit the crime: Evidence, law, and “sociological truth” among the Dou Donggo. American Ethnologist, 13(1), 43-61.
  48. A Friend of Peter Just, Page 16 Line 15-16, A Dispute in Donggo, Social and Cultural Anthropology
  49. Taylor, S. (1982). Durkheim and the study of suicide / Steve Taylor. (Contemporary social theory (London, England)). London: Macmillan.
  50. Durkheim É, Simpson G (ed.). Suicide: A Study in Sociology; 1979.
  51. Dickens, D., 1995. What's Wrong With Ethnography – Hammersly, M. Symbolic Interaction, 18(2), pp.207–216
  52. Stocking, G., 1991. Colonial Situations: Essays on the Contextualisation of Ethnographic Knowledge
  53. Eriksen, T.H., 2010. Small Places, Large Issues: An Introduction to Social and Cultural Anthropology pp.7-9 Third., London: Pluto.
  54. M. Sahlins (1972) The Original Affluent Society in Stone-Age Economics, Aldine
  55. E. E. Evans Pritchard, (1960), “Introductory” pp7-15 in The Nuer: A Description of the Modes of Livelihood and Political Institutions of a Nilotic People in South Sudan
  56. C. D. Johnson’s 1994/7 Ch1. 'The Hammer of the Kujurs’: Government Ethnography. And Nilotic Religions pp3-9 and pp29-34 in Nuer Prophets: A History of Prophecy from the Upper Nile in the 19th and 20th Centuries) Clarendon Press
  57. D. Freeman (1983) Margaret Mead and Samoa. The Making and Unmaking of an Anthropological Myth Harvard Uni Press
  59. Al-Attili A. 3.1 Positive and normative economics [Internet]. 2018. Available from: [Accessed: 10th Dec. 2018]
  60. Gombrich C, Everest J. Truth and the Disciplines. Lecture presented at; 2018; University College London.
  61. Martin E. The Egg and the Sperm: How Science Has Constructed a Romance Based on Stereotypical Male-Female Roles. Signs: Journal of Women in Culture and Society [Internet]. 1991. p. 492-493 ;16(3):485-501. Available from: [Accessed: 28th Nov. 2018]
  62. Biology and philosophy: an overview. In: Philosophical Issues in Aristotle's Biology; 1987. pp.5-8. Available at: [Accessed 9 Dec. 2018].
  63. Ford N, Warnock M. Historical influence of Aristotle on the theory of human reproduction. In: When Did I Begin?: Conception of the Human Individual in History, Philosophy and Science. Cambridge: Cambridge University Press; 1988. p. 19–64. p.26 Available at:[Accessed 9 Dec. 2018]
  64. GROUP, T. (1988). The Importance of Feminist Critique for Contemporary Cell Biology. Hypatia, [online] 3(1), pp.61-76. p.26 Available at: [Accessed 9 Dec. 2018]
  65. Ford N, Warnock M. Historical influence of Aristotle on the theory of human reproduction. In: When Did I Begin?: Conception of the Human Individual in History, Philosophy and Science. Cambridge: Cambridge University Press; 1988. p. 19–64. Available at:[Accessed 9 Dec. 2018]
  66. (2018). Saint Thomas Aquinas (Stanford Encyclopedia of Philosophy). [online] Available at: [Accessed 9 Dec. 2018].
  67. GROUP, T. (1988). The Importance of Feminist Critique for Contemporary Cell Biology. Hypatia, [online] 3(1), pp.61-76, p.62. Available at: [Accessed 9 Dec. 2018]
  68. GROUP, T. (1988). The Importance of Feminist Critique for Contemporary Cell Biology. Hypatia, [online] 3(1), pp.61-76, p.62 Available at: [Accessed 9 Dec. 2018]
  69. Oxford Dictionaries | English. (2018). catabolism | Definition of catabolism in English by Oxford Dictionaries. [online] Available at: [Accessed 9 Dec. 2018].
  70. Oxford Dictionaries | English. (2018). anabolism | Definition of anabolism in English by Oxford Dictionaries. [online] Available at: [Accessed 9 Dec. 2018].
  71. Gundersen G. Fertilisation [pdf]. Columbia University. Available at: [Accessed 6th December 2018]
  72. Schatten G, Schatten H. The Energetic Egg. Medical World News. 1984;23(5):51-53
  73. Guyton A. Physiology of the human body. 6th ed. Philadelphia: Saunders College Pub.; 1984. p. 613
  74. Martin E. The Egg and the Sperm: How Science Has Constructed a Romance Based on Stereotypical Male-Female Roles. Signs: Journal of Women in Culture and Society. 1991;16(3):485-501. Available at: [Accessed 29th November 2018]
  75. Martin E. The Egg and the Sperm: How Science Has Constructed a Romance Based on Stereotypical Male-Female Roles. Signs: Journal of Women in Culture and Society. 1991;16(3):485-501. p. 492-493. Available at: [Accessed 29th November 2018]
  76. Metoyer A, Rust R. The Egg, Sperm, and Beyond: Gendered Assumptions in Gynecology Textbooks. Women's Studies. 2011;40(2):177-205. p. 190. Available at: [Acessed 6th December 2018]
  77. Metoyer A, Rust R. The Egg, Sperm, and Beyond: Gendered Assumptions in Gynecology Textbooks. Women's Studies. 2011;40(2):177-205. p. 186. Available at: [Accessed 6th December 2018]
  78. Metoyer A, Rust R. The Egg, Sperm, and Beyond: Gendered Assumptions in Gynecology Textbooks. Women's Studies. 2011;40(2):177-205. p. 189. Available at: [Accessed 6th December 2018]
  79. Metoyer A, Rust R. The Egg, Sperm, and Beyond: Gendered Assumptions in Gynecology Textbooks. Women's Studies. 2011;40(2):177-205. p. 194. Available from: [Accessed 6th December 2018]
  80. Yameng Liang J, O’Halloran K, Tan S. Where Do I Come From? Metaphors in Sex Education Picture Books for Young Children in China. Metaphor and Symbol. 2016;31(3):179-193. Available at: [Accessed: 7th Dec. 2018]
  81. Yameng Liang J, O’Halloran K, Tan S. Where Do I Come From? Metaphors in Sex Education Picture Books for Young Children in China. Metaphor and Symbol. 2016;31(3):181. Available at: [Accessed: 7th Dec. 2018]
  82. Heckerling A. Look Who's Talking. Hollywood: Sony Pictures; 1989.
  83. Allen W. Everything You Always Wanted to Know About Sex* (*But Were Afraid to Ask). Hollywood: 20th Century Fox; 1972.
  84. Beaumont K. Policies for Sexuality Education in the European Union. Brussels: European Union; 2013. [Accessed Dec. 9 2018]
  85. College Board. The AP Program Results: Class of 2016. New York; 2016. Available from: [Accessed Dec. 9 2018]
  86. Evans J. Feminist Theory Today: An Introduction to Second-Wave Feminism. London: Sage Publications; 1995. [Accessed Dec. 9 2018]
  87. Dasgupta N, Stout J. Girls and Women in Science, Technology, Engineering, and Mathematics. Policy Insights from the Behavioral and Brain Sciences. 2014;1(1):21. [Accessed Dec. 9 2018]
  88. More Women, Better Investment Returns. Morgan Stanley. 2017. Available from: [Accessed Dec. 8 2018]
  89. Campo-Engelstein L, Johnson N. Revisiting “The fertilization fairytale:” an analysis of gendered language used to describe fertilization in science textbooks from middle school to medical school. Cultural Studies of Science Education [Internet]. 2013. p. 218 ;9(1):201-220. Available from: Accessed: [5th Dec. 2018]

Truth in memory: the reliability of eyewitness testimony

Introduction

In a "post-truth" society[1], where objectivity is overruled by emotion and personal belief in the shaping of public opinion, evidence-based truth seems endangered. Our society is dominated by individual truths in the form of experiences shared through social media. Those individual truths, given the limits of personal judgement, are only hypotheses generalised from personal experience and lacking solid proof.[2] Yet memory, as a representative of individual truth, is widely used as witness testimony in courtrooms to support the application of law, an evidence-based truth system.

Memory in the Courtroom

Truth and Law

Law is automatically associated with "the search for truth"[3]; however, in this discipline truth is 'not an explanatorily useful concept'[4], implying that veracity is not required to find someone guilty. The legal system's relationship with truth is ill-defined and ambiguous, in the sense that it seeks a "formal legal truth" dictated by laws rather than by veracity[3]. In other words, once the guilt of a party has been proven by facts, including faulty eyewitness testimony, the die is cast. A witness's narrative truth comes from their reconstruction of events, which the court considers solid evidence.[3]

Real-life Cases of False Memory

In the United States, 300 people were convicted of crimes they did not commit and spent years in prison before DNA testing proved their innocence. Of those 300, three quarters were victims of prosecutors' or witnesses' false memories.[5] Nineteen-year-old Holly Ramona started seeing a psychiatrist for bulimia and depression. With the help of sodium amytal, a drug normally used to treat short-term insomnia – also known as the "truth drug" due to its supposed capacity to restore lost memories – she remembered being sexually abused by her father as a child. Prior to seeing her psychiatrist, she had recalled no such thing. She went on to sue her father, Gary Ramona, who himself sued the psychiatrist for having induced false memories in his daughter's mind.[6] False memory syndrome is more common than one might think. Psychologist Elizabeth Loftus, a specialist in false memories, compares memory to a Wikipedia page: "you can go in there and change it, but so can other people". She conducted several studies proving that misinformation coming from another person can alter one's memory.[7] In Canada, a study[7] made subjects believe that, as children, they had been attacked by a vicious animal, and half of them actually "remembered" this episode. Forensic psychologist Scott Fraser studies how humans remember crimes. During a TED talk, he elaborated on the case of 16-year-old Francisco Carrillo, convicted of murder after being identified by witnesses. He spent 20 years in prison before one of the witnesses (the son of the victim) confessed that it was a false memory and that he had never seen the face of the murderer.[8] Steve Titus was likewise wrongfully jailed on a rape allegation because the victim believed it was him.[9]

When giving testimony, witnesses are under oath: "I swear by Almighty God that the evidence I shall give shall be the truth, the whole truth, and nothing but the truth."[10] How, then, can involuntary false memory be explained?

The Process of Remembering and Altered Memory

Influencing factors on witnesses

The accuracy of memories can be affected by many psychological factors, since memory is selective and susceptible to influence during its three forming stages: perception, retention and retrieval. The perception stage is affected by event factors, i.e. the interaction between the case and the observer, and by individual factors, meaning the witness's character and mental condition. For example, the more violent an incident is, the worse it is remembered. Retention is the most changeable stage, since it can be influenced by up to 12 separate aspects, for instance post-event information, memory enhancement and guessing. In particular, the memory-enhancement influence suggests that post-event discussion can shift the memory of that event. Meanwhile, memory can also be created by deduction from guessing, which better matches the socially favoured image of comprehensiveness. Retrieval, in turn, is affected by the environment and type of retrieval, the questioning, the questioners and two further factors. Specifically, studies[11][12] among college students demonstrate that memories are better recalled at the place where the event happened than at an unfamiliar place such as a police interview room.[13]

Memory Retrieval and Social Suggestions

The psychological process of remembering can be understood, through Frederic Bartlett's seminal "The War of the Ghosts" study[14], as a process of reconstruction. When asked to recall the Native American story, participants were found to change, "westernise" and shorten it, forgetting specific details and distorting the story overall by adding previously non-existent details or projected emotions. Bartlett's study shows that the act of remembrance is not passive but active, unconsciously distorting reality.

The courtroom does not provide a space where memory is free from change: persuasive language, prejudice, the roles of "good and bad" cast as "prosecutor and defendant", and the formal setting all exert influence. When confronted with a police line-up, studies[15] show that participants posing as witnesses believe that those on the other side of the mirror, even if innocent, are guilty, due to social construct. 40% of police line-ups end in the prime suspect being picked but, more interestingly, up to 20% end in the filler (an innocent person with traits similar to the suspect) being picked, demonstrating the effects that expectation and societal bias have on memory.[15] Thus sociocultural factors, such as the society (e.g. Western) in which witnesses were raised or their moral bias rooted in the setting (e.g. the police line-up), affect the participants' perception of events. Cultures have a direct impact on memories and on the perception of self.[16]

Memory and trauma

Traumatic experiences often result in the memory being repressed: it is absent from the conscious mind but can still be accessed through therapeutic procedures, owing to its presence in the hippocampus, a centre for long-term memory[17]. According to the psychiatrist Bessel A. van der Kolk[18], traumatic experiences affect one's memory in four ways: traumatic amnesia (being unconscious of the event until a stimulus triggers the memory), global memory impairment (being more susceptible to suggestion), dissociation (the memory is fragmentary), and the onset of post-traumatic stress disorder (PTSD). Eyewitnesses are directly exposed to stressful, traumatic experiences, which can thus alter their memories.

Conclusion

Loftus’s study[15] demonstrates that, for the same case, only 18% of participants judged the defendant guilty when no eyewitness testimony was given, compared with 72% when it was. The legal system's reliance on eyewitness testimony, heavily influenced by psychological and neurological factors, is increasingly being checked by technological and scientific advances such as DNA testing. "We see that science also rests on a faith; there simply is no science ‘without presuppositions’." (p.5)[19] Nietzsche's scepticism towards scientism poses the question: should we trust science when seeking truth?

References

Truth in history: the teaching of WW2 in Japan

Introduction

History is a formative discipline upon which other knowledge is grounded. Issues of historical truth can therefore be seen to have a wider social importance; the teaching of history has the power to influence and shape cultural identities, societal norms and political attitudes.

This chapter will highlight these issues within a specific case study, the teaching of WW2 in Japan, but it should be stated that maintaining objective truth in the teaching of history is a global issue. Objectivity in historical education is inevitably constrained by the protection of a cultural identity and by the structural limitations of historical research (1).

Japanese teaching of WW2 history

The pre-war Japanese education system was ‘characterised by a high degree of centralisation and domination by the national government’ (2). Until the end of WW2, the Ministry of Education wrote all textbooks. In spite of postwar efforts at liberalisation, these ‘never overturned the dominance of the state in the management of schools’ (2); the government maintains strict guidelines over the authorisation of textbooks. Sections of textbooks portraying Imperial Japan in a directly negative light (e.g. the Nanking Massacre, Unit 731, comfort women) are heavily edited down, often to single sentences (3) (4). The Ministry has been known to edit individual phrases, for example changing the word ‘invade’ to ‘advance’ when describing the invasion of Manchuria. This can be seen as a selective processing of historical truth, and this systematic omission and minimisation of Japanese aggression has led to condemnation of the Japanese educational system across North-East Asia (4). By contrast, British classrooms do cover Allied atrocities, notably the bombing of Dresden.

International tensions have been both a result and a cause of countries’ contrasting historical narratives (4), theoretically grounded in Foucauldian power-knowledge (5). Korean historical education gives more lesson time to modern history, whilst the Japanese international history curriculum lasts a mere 130 hours a year and covers all periods of history (3) (4). In a standard Japanese history textbook, only 19 of 357 pages (5.3%) are given to events between 1931 and 1945. Furthermore, an ‘examination-orientated’ (6) culture results in a cursory approach to education. This combination leads to a loss of the nuances of historical tragedies, shown by the matter-of-fact tone of Japanese history textbooks (4). A comparative study of North-East Asian textbooks found a lack of a common memory of WW2 (4); this difference in national narratives is reflected on the international stage (3).

How has this impacted the cultural identities and international relations of Japan and China?

The teaching of history in Japan is a wider interdisciplinary issue because it has caused problems within international relations and cultural identities.

China and Japan have starkly different perceptions of WW2. The Chinese government has used various means to amplify the historical memory of the Chinese people and to ensure that future generations are reminded of the war, with exhibitions that ‘highlight the extreme brutality and sadism of the Japanese military, underscored with graphic images and chilling dioramas of scenes from the war.’ (7) On the other hand, there are memorials in Japan that commemorate convicted war criminals. (8) With places set up as ‘patriotic education sites’ (7), it can be seen that nationalism and identity are inextricably linked with a particular historical narrative in both countries.

Japan’s denial of its troops’ actions during WW2 resonates in Chinese-Japanese relations to this day; some prime ministers have made public apologies, whereas others have denied the actions of Japanese troops during WW2. In 2008, Japan’s military chief, General Toshio Tamogami, published an essay denying the acts of the Japanese troops during WW2, which won a competition named, ironically, ‘True Perspectives of Modern and Contemporary History’. (9) In 2015, Prime Minister Shinzo Abe offered his remorse for what China suffered during WW2 but ‘did not offer a new apology of his own’, according to official Chinese media. (10) It is evident that the inconsistencies in the positions of Japanese and Chinese representatives have only complicated matters of international relations.

Interesting parallels can be drawn between the Japanese and German conveyance of history. ‘The efforts of Germany to reconcile with its neighbors and allies – and contextualise its national identity – also reveal(s) the strains of trying to come to terms with a difficult past.’ (11) German standards of historical education can be seen to result from Germany's central geopolitical position in the Cold War, which necessitated an attitude of reconciliation. Japan’s geopolitical situation, standing in opposition to China, did not produce this conciliatory attitude. That said, as we examine the ways in which countries choose to commemorate their roles in WW2, it is important to note that today’s cultural environments, relations and conflicts have been fostered by each country’s political decisions. (12)

Why is this a problem for the world and interdisciplinarity?

This example highlights the interdisciplinarity of the issue of bias in education, a crucial and controversial topic due to its impact on ‘the formation of young people’s attitudes to other countries, races, and civilisations’. (13) Psychology, history, international relations, politics, and national pride all affect the way specific events are taught. This is a universal issue and not just one of Japan; research has been done into the teaching of history in America (14), across Europe (15), and in many other places across the world.

Education is inevitably the product of a certain state ideology which undoubtedly impacts national attitudes and societal norms in a reflexive manner, but the topic also raises wider issues concerning bias in historical ‘truth’(16): even in a society where teachers strive to present a balanced argument, they are constrained by the natural limitations of historical research (1).

Efforts to combat this problem exist in the form of international textbook improvement initiatives (17); UNESCO has worked to develop textbook guidelines as part of its Global Citizenship Education, which has had many successes over the years (18). However, as it acknowledges in its extensive research, ‘a number of models’ are required, with the balance between ‘external intervention and local ownership’ playing an important role in the effectiveness of textbook reforms. The research also notes the limitations of reform and regards the issue as part of a wider context: educational bias is both a symptom and a cause of international relations, cultural bias, and conflict.

Conclusion

Historical truth concerns more than just history. The Japanese conception of its wartime history is problematic because of its contested relation to truth (6). ‘The existence of distinct historical memories’ (19) arguably results from the fragmentary nature of historical narrative formation (1). UNESCO’s attempt to revise textbooks on an international scale is a positive step forward, but its limitations should be acknowledged. (20)

However, the continuous minimisation of wartime atrocities enforced by the Ministry of Education shows the difference in collective memories of North-East Asia to be resultant from more than just historical constraints. This educational misinformation is structurally enforced in Japanese schools due to the power of the Ministry of Education, economic imperatives of publishers, limitations of the curriculum and the lack of a will to engage with the iniquitous nature of their imperial past. While the extent to which nationalism, geopolitics, and discourses of truth (19) cause and affect historical education is indeterminate, what is certain is ‘the past in Northeast Asia is very much a part of the present’ (20).

References

(1) McNeill, William H. Mythistory, or Truth, Myth, History, and Historians. The American Historical Review. 1986. Available from: [Accessed 29th November, 2018].

(2) Sugimoto, Yoshio. An Introduction to Japanese Society. 3rd ed., Cambridge University Press, 2010. Available from: [Accessed 29th November, 2018].

(3) Nozaki, Y.. War Memory, Nationalism and Education in Postwar Japan. Oxfordshire. Routledge. 2008. Available from: [Accessed 29th November, 2018].

(4) Shin, G and Sneider D. History Textbooks and the Wars in Asia: Divided Memories. Routledge. 2008. Available from: [Accessed 29th November, 2018]

(5) Ball, Stephen J. Foucault, Power, and Education. New York. Routledge. 2013.

(6) Doi, T. The Japanese patterns of communication and the concept of amae. Quarterly Journal of Speech. 1973. 59(2), 180-185. Available from: [Accessed 29th November, 2018].

(7) Lau, Julia. Historical Memory and its Impact on Sino-Japanese Relations. Center for Security Studies. 2015. Available from: [Accessed 29th November, 2018].

(8) Wingfield-Hayes, Rupert. China condemns Japan PM Shinzo Abe’s Yasukuni shrine visit. BBC News. 2013. Available from: [Accessed 29th November, 2018]

(9) CNN. Japan fires military chief over WWII denial. CNN World. 2008. Available from: [Accessed 29th November, 2018].

(10) Soble, Jonathan. Shinzo Abe Echoes Japan’s Past World War II Apologies but Adds None. The New York Times. 2015. Available from: [Accessed 29th November, 2018].

(11) Hein, LE. Selden, M. Censoring History: Citizen and Memory in Japan, Germany, and the United States. New York. M. E. Sharpe. 2000. Available from: [Accessed 29th November, 2018].

(12) Cole, Elizabeth A. Transitional Justice and the Reform of History Education. The International Journal of Transitional Justice. 2007. Volume 1. (Issue 1). Pages 115-137. Available from: [Accessed 29th November, 2018].

(13) Stobart, Maitland. “The Council of Europe and History Teaching.” Internationales Jahrbuch Für Geschichts- Und Geographie-Unterricht, vol. 15, 1974, pp. 230–239. Available from: [Accessed 29th November, 2018].

(14) Bryant, Margaret. “Teaching History.” Teaching History, vol. 3, no. 10, 1973, pp. 148–148. Available from: [Accessed 29th November, 2018].

(15) Conaill, Tomás O'. “PREJUDICE AND BIAS IN HISTORY TEACHING: A.U.K./IRISH CONFERENCE.” Teaching History, vol. 4, no. 13, 1975, pp. 61–63. Available from: [Accessed 29th November, 2018].

(16) Cowie, Evelyn E. “British Journal of Educational Studies.” British Journal of Educational Studies, vol. 16, no. 3, 1968, pp. 339–339. Available from: [Accessed 29th November, 2018].

(17) Pingel, Falk, ‘UNESCO guideline on textbook research and textbook revision’, 2009, Available from [Accessed 1st December 2018]

(18) Dance, E.H. History the Betrayer. Second Edition. Hutchinson. 1964. Available from: [Accessed 29th November, 2018]

(19) Hein, Laura. “Teaching War is Not Easy: Controversies in Japan, Germany and the United States” History and Public Policy Program. 2001. Available from: [Accessed 5th December, 2018]

(20) Chirot, Daniel. Confronting Memories of World War II: European and Asian Legacies. Seattle: University of Washington Press, 2014. Available from: [Accessed 30th November, 2018].

Truth and Power: Education under Hitler

Hitler-Jugend (1933)

Truth is commonly defined as information that is factual or in accordance with reality. However, history indicates that it is possible for those in power to orchestrate a large-scale manipulation of truth. The influence of power on truth can be explored theoretically through the disciplines of philosophy and psychology, and the real-life application of these theories can be evaluated against historical evidence using the case study of education in Hitler’s Germany. This is relevant to interdisciplinary studies because the manipulation of truth runs parallel with a strategy of narrowing the disciplines taught in education. Essentially, there is a relationship between seeing things narrowly, from one perspective, and being misled.

Philosophy: Foucault's view on Truth and Power

The philosopher Michel Foucault studied the creation of truth under a system of power. He developed two relevant concepts: the conflict between “event-truth” and “demonstration-truth”, and “inquiry”.

Michel Foucault

"Event-truth" and "Demonstration-truth"

In his book Madness and Civilization, Foucault distinguishes “event-truth” from “demonstration-truth” using the example of a psychiatrist diagnosing someone with madness. “Event-truth” is the truth experienced by the patient and cannot be predicted by the psychiatrist. “Demonstration-truth” arises from structured, scientific reasoning based on proof. Foucault shows that, through diagnosis, the psychiatrist (who holds power) tries to replace the “event-truth” with the “demonstration-truth”. The psychiatrist controls the demonstration and therefore imposes a unilateral, monodisciplinary truth on an issue. Without an interdisciplinary approach, those who determine truth can develop an awareness of their power and gain more control over others[20]. The power dynamic between a teacher and a student is linked to this concept: the "event-truth" is the truth that the student discovers when facing the unknown, and the teacher has the power to impose a "demonstration-truth" over it.

Inquiry

Foucault sees the inquiry into truth as an exercise of power. Those who are forced to accept the inquiries of others are not empowered by the meaningful process of acquiring and authenticating knowledge, and are therefore easier to manipulate. When the only source of truth stems from those in power, one can be conditioned to believe things that are not factual[21]. In education, the "inquiry" is performed by the teacher, and young students may acquire knowledge without the capacity to evaluate its truth independently.

Psychology

Those in power are able to manipulate people by imposing objective truths. Over time, these truths become embedded in society by conditioning people to behave in a certain way.

Conditioning in Education

Figure 1: Pavlov's Dog

Emotions such as fear or contempt can be used to impose truths, whereby any questioning or criticism is met with radical, systematic measures. This contradicts the principle of falsifiability. Opposition comes to be seen as an anomaly, which removes the conscious possibility of disagreeing with power. Furthermore, violence perpetrated by those revolting against the power is seen as wrong, further casting them as an "anomaly"[22].

This situation resembles Pavlov's conditioning experiments. In Figure 1, a conditioned response forms when the dog is repeatedly brought food while a metronome is ticking, so that the dog eventually salivates at the metronome even when no food is present[23]. When free thought is punished, the result is constant self-questioning and anxiety until the individual's actions conform to the expectations of the person in power[24].

Foucault, Psychology and the Infiltration of Racism

According to Foucault, micro and macro levels of power must work in conjunction with each other in order to affect an individual’s beliefs.[25] Racism cannot be fully enforced or regulated by the state, so how does it deal with personal, informal interactions, the bonds and friendships that cut through the education and segregation enforced by sovereignty? There must exist two truths, subjective and objective, where the subjective truth is highly dependent on the views and opinions of one's peers, and the objective truth is more susceptible to imposition by the state.

Goebbels's statement that “propaganda can only be effective if it is broadly in line with preexisting notions and beliefs” helps explain how malleable the minds of teenagers were under the Nazi regime. Areas of Germany that voted for anti-Semitic parties before WW1, and where indoctrination was most effective, remain more anti-Semitic today.[26]

History: Evidence of Manipulating Truth

The philosophical and psychological theories discussed above may be applied to the case study of youth under Hitler. When the Nazi Party (NSDAP) came to power in 1933, part of its agenda was to shape the education system to serve its political ideals[27]. By 1939, most young people supported the NSDAP and many would denounce their parents to law enforcement[28]. This demonstrates the success of manipulating truth on a large scale through education.

Hitler Youth Hour of Commemoration in front of the Town Hall in Tomaszow, Poland during Nazi occupation. (1941)

Foucault and the School Curriculum During the Nazi Period

Poster for the Film, The Aryan (1916)

Syllabi for Geography and Politics were adapted to encourage a “fanatical devotion to the national cause”[29], and teachers did not allow inquiry beyond Nazi ideology[30]. Students were punished for expressing their opinions or criticisms freely[31]. This corresponds to Foucault’s ideas on controlling the boundaries of intellectual inquiry as an exercise of power, whereby "demonstration-truth" replaces "event-truth".

Psychological Conditioning

During the Nazi period, school disciplines were linked solely to the concept of Aryan racial superiority and emptied of any other interpretation or purpose. One famous example is the mathematics problem that depicted people with disabilities as a burden on German society[32]. By forcefully reiterating Nazi ideology in all disciplines, children were conditioned into believing these concepts.

Imposing Subjective and Objective Truth

Objective truth during this period was systematically imposed by the state. The NSDAP set up the Nazi Teachers Association, membership of which was compulsory for all teachers[33]. Teachers were retrained to teach eugenics and “Racial Science” as objective fact, providing what seemed like formal scientific evidence for Aryan supremacy[34].

Subjective truth was established by youth movements such as the Hitler Youth. Rather than an enforcement of truth by power, German youth felt peer pressure from friends of the same social status[35]. This may have been more convincing to children who had trouble accepting truth from authority.

Physical Education

Physical Education was increased from 2 to a minimum of 10 hours a week in primary schools, at the expense of subjects like Religious Studies and languages[36]. The idea was to make the physical fight seem worthier than the intellectual fight. A lack of intellectual resilience meant students would not challenge the knowledge system set by the NSDAP. This implies that a lack of interdisciplinary thought made children more vulnerable to accepting truth from power.

This led to confirmation bias: those who believed in Nazi ideology became the majority, and the general acceptance of this truth was a source of validation[37]. Challenges to this validation were viewed as a consequence of genetic and mental illness, underlining the growing lack of critical thinking in society.

Sources

Truth in Politics

Introduction

Oxford Dictionaries defines 'post-truth' as "relating to or denoting circumstances in which objective facts are less influential in shaping public opinion than appeals to emotion and personal belief"[38]. The term is deeply embedded in politics: Newt Gingrich has argued that what people feel about the truth matters more in politics than actual facts[39].

Most Western countries, where the term 'post-truth' has mostly been used, are democracies, and the word democracy originates from the Greek words ‘demos’, meaning people, and ‘kratia’, meaning power. Democracy thus signifies ‘rule by the people’; yet ‘demos’ can also connote ‘mob’, illustrating that politics can be led by an ignorant majority[40]. 'Post-truth' politics is therefore perhaps inevitable, and it may reinterpret factual truth in other disciplines according to political belief.

Climate change: scientific and political truth

A CBS poll indicated that just 11% of strong Trump supporters believe the mainstream media, while 91% of them trust Trump.

Scientific truth is often considered objective, and the Intergovernmental Panel on Climate Change (IPCC), which assesses climate change from a purely scientific perspective, has concluded that the temperature of the Earth will increase by 3°C and that this is related to human activity[41]. Political truths, however, are different. Trump argues that scientists are not free from confirmation bias and have a 'political agenda'[42]. He has also suggested the temperature could well fall and thus does not want to spend trillions of dollars combating global warming. Although he has provided no objective evidence for this belief, the validity of his claims is, to his supporters, irrelevant. Polling suggests that few strong Trump supporters trust the mainstream media, whereas the majority believe Trump[43]. What he believes therefore becomes political truth, at least to his supporters, who as a majority determine actions in politics. Consequently, Trump was able to cut $2.8bn from the budget of the Environmental Protection Agency (EPA)[44] and so reduce the influence of scientific truth. In recent politics, then, scientific truth is less likely to correspond to the public view of truth, and political truths may eliminate other truths about climate change, such as those of human geography, in order to reach desired conclusions.

Truth in law and political truth

Similarly, truth in law strictly considers objective evidence, and judgments are bound by limitations such as the Constitution. However, Brett Kavanaugh, now an Associate Justice of the US Supreme Court, was accused of sexual misconduct alleged to have taken place in the early 1980s. During his appointment process, the allegation was debated hastily, and largely in relation to whether he should be confirmed to the Supreme Court regardless of truth in law[45]. Those who opposed the appointment argued that, even though the nominee denied the allegation, the appointment should not continue until an FBI investigation had been completed, while Republican advocates of Kavanaugh said he was well qualified for the role in any case[46]. Ultimately, he was appointed to the Supreme Court without such an investigation.

The US Constitution, the most fundamental body of law, insists that the law must be applied in the same way regardless of the person[47], and therefore that the truth should not vary. Yet it is uncertain whether the rule of law is applied when a political truth is being defined. Consequently, truth in law can be altered when a case is loaded with political questions, in order to favour a certain political belief.

Ways of Knowing in Politics

The essential ways of knowing (WOKs) in politics are reason, language, and emotion. Locke argues: "The freedom of man and liberty of acting according to his own will, is grounded on his having reason"[48]; hence reason enables citizens to establish a just political regime through democracy, and without it the electoral process is flawed. Emotion, by contrast, has been portrayed as a detrimental force undermining our capacity to reason[49], one that must therefore be controlled, if not extirpated. This conception dominated the classical Greek period and remains influential in modern psychology. While classical economists assume human rationality, psychologists, like John M. Grohol, who believes emotion dragged America into the Vietnam and Iraq wars[50], argue that emotions can undermine a person’s capacity for rational decision-making. Emotion in political decisions thus existed before the post-truth era, illustrating the natural impossibility of truth without emotion in politics. If we accept the classical approach that denounces emotion, a democracy with a perfectly informed electorate is also impossible. There are, however, arguably examples where emotion benefits political truth: emotion can reach a more profound truth in relation to the refugee crisis. Truth is obscured by emotion, however, when demagogues manipulate national emotions such as anger or fear; history shows this pattern in Western democracies in the rise of extremist views and autocratic leaders[7]. When emotion prevents truth, this has ramifications beyond politics, most notably for the level of democracy, and economic theory also predicts news-market failures in response to the rise of fake news[51].

Hans Rosling, the famous statistician, wholeheartedly rejects Mark Twain’s quip, “Lies, damn lies, and statistics”[52], and instead argues that statistics highlight widely held misconceptions and “tell us if the things we think are actually truth”.[53]

Statistics justifying political truth

Statistics as evidence produces knowledge by justifying a belief so as to establish it as true, in line with Plato’s definition of knowledge as justified true belief[54]. Nevertheless, statistics in health, particularly diet, consistently yields the opposite, namely ambiguity and conflicting conclusions, which can lead to confusion and even mistrust[15]. In one study, 37% of Americans agreed that “research about the health effects cannot be trusted because so many studies conflict.”[15] Perhaps in practice, owing to casual causal claims[55], collective statistical illiteracy[56], and the economic aims of the food and pharmaceutical industries, the use of statistics in politics, as in other disciplines, is flawed as a means of justifying a truth. Statistical data does not allow for lies so much as semantic manipulation: numbers drive the misuse of words.[57]
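The claim that numbers drive the misuse of words can be made concrete with a small illustrative sketch (a hypothetical example of ours, not drawn from the chapter's sources): the very same study result can be reported as an alarming relative increase or as a negligible absolute one.

```python
# Illustrative sketch: the same underlying data can be framed as a dramatic
# relative change or a negligible absolute change. The numbers are invented
# for illustration only.

def risk_framings(cases_exposed, n_exposed, cases_control, n_control):
    """Return the absolute and relative risk increase for a hypothetical study."""
    risk_exposed = cases_exposed / n_exposed
    risk_control = cases_control / n_control
    absolute_increase = risk_exposed - risk_control                    # percentage-point change
    relative_increase = (risk_exposed - risk_control) / risk_control   # proportional change
    return absolute_increase, relative_increase

# Hypothetical finding: 2 cases per 1000 consumers of some food vs 1 per 1000 controls.
abs_inc, rel_inc = risk_framings(2, 1000, 1, 1000)
print(f"Absolute increase: {abs_inc:.3%}")   # prints "Absolute increase: 0.100%"
print(f"Relative increase: {rel_inc:.0%}")   # prints "Relative increase: 100%"
```

Both figures are arithmetically correct; which framing a headline or politician chooses is a semantic decision, not a statistical one.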

An Inconvenient Truth

The title HyperNormalisation describes life when people understood the insanity of government propaganda but had difficulty envisaging an alternative. Curtis narrates in a voiceover how Trump realised that "in the face of that, you could play with reality" and in the process "further undermine and weaken the old forms of power."[58]

Political truth has transformed factual truth into a controversial topic by making it changeable and open to interpretation; hence factual truth has changed not only in politics but also in other disciplines. Politicians capitalise on people’s psychology: as technology improved, it became easier for leaders to operate with their own facts by triggering people's emotions, which undermine their capacity to reason and lead them to believe unreasonable 'truths'. The issue of truth in politics is displayed across various disciplines. An Inconvenient Truth, a documentary by Al Gore[59], emphasises the severe issues of climate change; it raises awareness of the issue and demonstrates that, even when scientific facts are provided, the determination to act is absent because of the "truth" in politics. In filmmaking, HyperNormalisation,[60] a documentary by Adam Curtis, is a contemplation of life in the post-truth era. According to Renee DiResta, a researcher of online conspiracy theories, the internet no longer merely reflects truth; it shapes an entire reality that operates with its own facts.[3] Politicians also use language to manipulate people's thoughts and decisions. George Orwell wrote that "political chaos is connected with the decay of language"[61], suggesting that language creates a gap between a leader's real goals and declared aims. President Trump spreads his own truths and creates a disoriented public,[3] adapting everyday language to control how people communicate. As a result, politicians distort truth in law and truth in science, such as on climate change.

Emotion in politics is inevitable according to psychology and history; yet, even combined with the Greek philosophers' view of emotion, this has not prevented democracies from thriving. There can never be one political truth: individual truth depends on perception, and collective truth changes constantly. However, citizens have a basic right to information in a democracy[7], and the level of factual truth affects the political structure, the economy, and the knowledge produced across multiple disciplines.

  1. Harari Y.N ; Are we living in a post-truth era? Yes, but that’s because we’re a post-truth species. Ideas.Ted.Com. 7 September 2018; Available : [Accessed : 7 December 2018]
  2. Taylor W.S., Is Truth individual or social, Journal of Social Psychology, Vol. 6, Iss. 3, Aug 1, 1935. p. 348.
  3. Fernandez J.M., An Exploration of the Meaning of Truth in Philosophy and Law, 11 The University of Notre Dame Australia Law Review 53; 2009. Available from : [Accessed : 7th December 2018]
  4. Patterson D., Law and Truth , 1st edition, USA : Oxford University Press, 1996
  5. Torgovnick May K., The fiction of memory, TedGlobal, 2013 Available from: [Accessed: 8th December 2018]
  6. August Piper Jr., M.D. Is there a truth serum?, December 2013, Available from: [Accessed: 7th December 2018]
  7. Loftus E.F., How reliable is your memory?, TEDGlobal 2013, 2013. Available from: [Accessed: 7th December 2018]
  8. Fraser S., Why eyewitnesses get it wrong, TEDxUSC2012, 2012 available from: [Accessed: 7th December 2018]
  9. Henderson P., Looking back at Titus case,1981, Available from: [Accessed: 7th December 2018]
  10. Pigott R., Motion to end Bible oaths in court defeated, BBC News, 19 October 2013, Available from : [Accessed : 9th December 2018]
  11. Abernathy, E.M., ‘The effect changed environmental conditions upon the results of college examinations’,10,Journal of Psychology, 1940, p.293-301
  12. Feingold, G.A., 'The Influence of environment on identification of persons and things’, 5, Journal of Criminal Law and Criminology 1914, p.39-51
  13. Loftus E.F., Wolchover D.and Page D., 'General Review of the psychology of Witness Testimony'.Witness testimony : Psychological, investigative and evidential perspectives , Oxford: Oxford University Press. 2006, pp. 8-17
  14. Bartlett F.C., Remembering: A study in experimental and social psychology, Cambridge, UK: Cambridge University Press, 1932
  15. Wright D.B. and Loftus E.F., Eyewitness memory. In : Cohen G., Conway M. (eds.) Memory in the Real World, Psychology Press, 3rd edition, 2008, p.95-97
  16. Williams H.L., Conaway M.A. and Cohen G., Autobiographical memory. In : Cohen G., Conway M. (eds.) Memory in the real world Psychology Press, 3rd edition, 2008, p.71
  17. Lynn S.J. and McConkey K.M., Truth in Memory, The Guilford Press, 1998, p. 331-332
  18. van der Kolk B.A. & Fisler R., Dissociation and the Fragmentary Nature of Traumatic Memories:Overview and Exploratory Study, 1995 Available from : [Accessed : 7th December 2018]
  19. Cristy R., "“Gay Science” as a Conditional Will to Truth", Princeton University Available from : [Accessed : 9th December 2018]
  20. Blais L. Savoir expert, savoirs ordinaires : qui dit vrai ? : Vérité et pouvoir chez Foucault. Sociologie et sociétés [Internet]. 2007 Sept [cited 2018 Dec 09]; 38(2):151–163. Available from :
  21. Foucault M. Truth and Juridical Forms. Social Identities [Internet],2010 Aug [cited 2018 Dec 02]; 2(3):341. Available from:
  22. Koch, H. W. (1996). The Hitler Youth: Education, 1922–1945. New York: Barnes and Noble. ISBN 978-0-88029-236-8.
  23. Hart-Davis, A. Pavlov's Dog. Modern Books; 2018
  24. Gallistel, C. R et al. The Symbolic Foundations of Conditioned Behavior. New Jersey: Lawrence Erlbaum Associates; 2002
  25. Hook, D. (2007), Foucault, Psychology and the Analytics of Power, page 221,
  26. Voigtländer N, Voth HJ. Nazi indoctrination and anti-Semitic beliefs in Germany. Proc Natl Acad Sci U S A. 2015;112(26):7931-6.
  27. Teaching Holocaust and Human Behavior: Youth and the National Community [Internet]. Facing History and Ourselves. [cited 9 December 2018]. Available from:
  28. Life for young people in Nazi Germany [Internet]. BBC Bitesize. [cited 9 December 2018]. Available from:
  29. Snyder, L. Encyclopedia of the Third Reich. London: Hale; 1976. pg79
  30. Evans, R. The Third Reich in Power. New York Penguin Press; 2005. pg270
  31. Education in Nazi Germany [Internet]. Spartacus Educational. [cited 9 December 2018]. Available from:
  32. Snyder, L. Encyclopedia of the Third Reich. London: Hale; 1976. pg79
  33. Nazi social and economic policies [Internet]. BBC Bitesize. [cited 9 December 2018]. Available from:
  34. Buller, E. Amy. Darkness Over Germany. London: Longmans, Green; 1943
  35. Mackinnon, M. The Naked Years: Growing Up in Nazi Germany. London: Chatto & Windus; 1987
  36. Koch, H. W. The Hitler Youth: Education, 1922–1945. New York: Barnes and Noble; 1996
  37. Kershaw, I. The Führer Myth: How Hitler Won Over the German People [Internet]. Spiegel Online. [cited 9 December 2018]. Available from:
  38. McComiskey, Bruce. "Post-Truth Rhetoric and Composition." In Post-Truth Rhetoric and Composition, Boulder, Colorado: University Press of Colorado, 2017, p1-50
  39. Forbes, Ethan Siegel, 'Newt Gingrich Exemplifies Just How Unscientific America Is', 5/08/16. Available from: [Accessed 30th November 2018]
  40. Jonathan Wolff, An Introduction to Political Philosophy, Oxford University Press, 2006, p67
  41. Zedillo Ponce de León E. (ed.), Global warming: Looking beyond Kyoto (UPCC book collections on Project MUSE). New Haven, Conn.: Center for the Study of Globalization, Yale University; Washington, D.C.: Brookings Institution Press, 2008, p15-17
  42. BBC, 'Trump: Climate change scientists have 'political agenda'', 2018/10/15, Available from:[Accessed 27th November 2018]
  43. CBS News, Anthony Salvanto, Jennifer De Pinto, Kabir Khanna and Fred Backus, Trump backers stand by president in face of Russia criticism — CBS poll, 29/07/2018, Available from: [Accessed 23th November 2018]
  44. Independent, Mythili Sampathkumar, 'Donald Trump's budget proposal includes major cuts to environmental programmes', 2018/02/12. Available from: [Accessed 30th November 2018]
  45. The Washington Post, Sally Kohn, 'Kavanaugh isn’t entitled to a Supreme Court seat, just as men aren’t entitled to sex', 2018/09/24, Available from: [Accessed 23rd November 2018]
  46. ABC news, Meridith McGraw, 'At Las Vegas rally for Republican candidate, Trump says Kavanaugh 'is going to be just fine', 2018/09/21, Available from: [Accessed 27th November 2018]
  47. Tamanaha BZ. Classical origins. On the Rule of Law: History, Politics, Theory. Cambridge: Cambridge University Press; 2004. p. 7–14.
  48. Grant, Ruth W. John Locke's Liberalism. University of Chicago Pr, Chicago and London, 1987.
  49. George Marcus, Oxford Handbook of Political Psychology, Edition: 1, Chapter: The Psychology of Emotion and Politics, Publisher: Oxford University Press, Editors: David O. Sears, Leonie Huddy, Robert Jervis, pp.182-221
  50. Grohol, J. (2016). Humans are governed by emotions. Psych Central. Retrieved on December 7, 2018, from: [Accessed 7th December 2018]
  51. Martens, Bertin, et al. “The Digital Transformation of News Media and the Rise of Disinformation and Fake News.” European Commission, JRC Technical Reports, Apr. 2018.
  52. “Lies, Damned Lies, and Statistics.” Wikipedia, Wikimedia Foundation, 24 Nov. 2018 [Accessed 7th December 2018]
  53. Smith, Edwin. “Hans Rosling: the Man Who Makes Statistics Sing.” The Telegraph, Telegraph Media Group, 7 Nov. 2013, from: [Accessed 7th December 2018]
  54. "Propositional Knowledge Definition". The Columbia Encyclopedia, 6th ed., 2018, from: [Accessed 7th December 2018]
  55. Butterworth, Trevor. “Is Soda A Smoking Gun For Teen Violence – Or Just Statistical Illiteracy?” Forbes, Forbes Magazine, 6 Nov. 2011, from: [Accessed 7th December 2018]
  56. Gigerenzer, Gerd, et al. “Helping Doctors and Patients Make Sense of Health Statistics.” Journal of Research in Crime and Delinquency, 1 Nov. 2007, from: [Accessed 7th December 2018]
  57. Goodman, Jonathan R. “How Statistics Are Twisted to Obscure Public Understanding – Jonathan R Goodman | Aeon Ideas.” Aeon, 7 Dec. 2018, from: [Accessed 7th December 2018]
  58. HyperNormalisation. [Film] Directed by: Adam Curtis. UK: BBC; 2016.
  59. An Inconvenient Truth. [Film] Directed by: Davis Guggenheim. USA: Lawrence Bender Productions; 2006.
  60. HyperNormalisation. [Film] Directed by: Adam Curtis. UK: BBC; 2016.
  61. Orwell, George. Politics and the English Language. 1946. Available from: [Accessed 7th December 2018].

The Relationship Between Truth and Politics

Introduction edit

Political language – and with variations this is true of all political parties, from Conservatives to Anarchists – is designed to make lies sound truthful and murder respectable, and to give an appearance of solidity to pure wind. – George Orwell, 1946

Truth is a concept that can be thought of as fundamental and binary: a statement is either true or false. In reality, however, this conception can be segmented into a variety of subcategories. Truth can be observed in humanities such as history or anthropology, in more scientific areas such as mathematics, biology or physics, and indeed in politics. Truth can be defined as the correspondence between a judgement or an opinion and reality. Politics, on the other hand, is described as ‘The activities associated with the governance of a country or area, especially the debate between parties having power’[1]. It is widely accepted that the exercise of power is not possible without a certain practice of lying and manipulation. Philosophers from antiquity onwards, as well as modern thinkers such as Hannah Arendt, have regarded truth and politics as antithetical. Through language, politicians can gain influence and power over large groups of people, and can therefore use speeches and sermons to sway the masses to their point of view. This means that a lying politician can be viewed as honest if he perfectly masters the art of demagogy and rhetoric. We have reached a point where society seems more likely to trust those who appeal to emotions and feelings rather than rational facts in order to convey their ideas. The era in which we live today is increasingly perceived as a ‘post-truth’ era: the Oxford Dictionary defines the adjective ‘post-truth’ as ‘relating to or denoting circumstances in which objective facts are less influential in shaping public opinion than appeals to emotion and personal belief’[2]. The intention of this chapter is to showcase the importance of interdisciplinarity in the understanding of complex issues, and how it enables us to achieve a more holistic way of addressing real-world problems.
By using philosophical theories, we can gain a much wider perspective on, and a deeper knowledge of, the current phenomenon of false information in politics, illustrated here through the ongoing Brexit process.

Rational Truth and Factual Truth edit

Rational truths are 'the products of our mind such as doctrines, theorems, arguments, equations and scientific models’[3], which can be proved right by simple reasoning and valid relation to a theoretical method or criterion. These truths are interconnected with logical truth, which points to reasonable and correct outcomes to a problem: in ‘1+1=2’, for example, the human mind automatically deduces the correct outcome.

Factual truth is what actually happens in the course of human existence. As Arendt defines them, facts are “the inevitable outcome of men living and acting together”[4], and thus they are very prone to being used for personal gain by changing the actual reality into something more beneficial.

Thus, as facts concerning human affairs are not based on a theoretical or rational method, it is natural for the human mind to alter the outcome when there is no single right answer based on scientific deduction. This indicates that factual truths are more exposed to destruction when someone attempts to modify them. Another distinction between the two kinds of truth is that while rational truth can be falsified, factual truths are often turned into lies in pursuit of some gain from that lie. This act of turning factual truth into lies is often referred to as ‘organised lying’, which is defined by fabricating “images” of reality with an opportunistic goal, presenting them as “true” and leading others to believe that they are “true".[5][6]

Organised lying edit

Defined as “the relatively recent phenomenon of mass manipulation of fact and opinion as it has become evident in the rewriting of history, in image-making, and in actual government policy”[7], organised lying has an important relevance to our social life today. By concocting stories, in other words altering factual truths for the purpose of a rise to power, politicians are able to win over like-minded people to vote for them, or to enforce a particular ideology upon society. This is particularly common because factual truths are within the grasp of every individual, making it easy to take factual truths and form them into lies, as there are no dividing borders between facts, lies and opinions. Organised lying can be persistently found in political campaigning; one recent example is the UK’s referendum on EU membership, referred to as ‘Brexit’.

Brexit edit

On June 23rd 2016, the British public voted in favour of leaving the European Union by 51.89% to 48.11%[8], in a political saga known as Brexit. A huge factor in this outcome was the strength of the Vote Leave campaign. Vote Leave was selected by the Electoral Commission on 13th April 2016 as the primary campaign group for the motion of leaving the European Union in the approaching referendum, entitling the group to £600,000 of public money and a higher spending limit of £7 million[9]. In light of the victory of the Vote Leave campaign, coupled with the government's inability to negotiate a Brexit deal that satisfies the expectations of voters and the Conservative Party, the actions of the Vote Leave campaign have come under great scrutiny, specifically the claims that led to the public's decision to leave the EU.

The NHS edit

The most infamous claim made by the Vote Leave campaign was the long-debated '£350 million' statistic. Printed on the side of the "Brexit Bus," it stated 'We send £350 million to the EU every week, let's fund our NHS instead.' This became a hugely significant point of contention between Leave and Remain voters. In an article for the Telegraph in 2017, Leave campaigner and then Foreign Secretary Boris Johnson reiterated this claim, stating "And yes – once we have settled our accounts, we will take back control of roughly £350 million per week[10]." However, the UK Statistics Authority has said that the £350 million claim is "misleading and undermines trust in official statistics[11]." Full Fact claims that the UK actually pays the EU 'around £250 million,' and also that 'we get some money back from the EU.' It would appear that the £350 million claim by the Vote Leave campaign is an oversimplification.

It must also be noted that the issue of the NHS is not simply economic: as a health service, genuine human lives are at risk as the uncertainty surrounding Brexit increases. A BMA survey in November 2018 found that 35% of EU doctors had considered leaving the UK following the referendum vote[12]. These are skilled individuals who conduct vital medical research as well as caring for the most vulnerable members of society. The negative effects of this uncertainty and underfunding could have consequences ranging from stress and discomfort to the cancellation of vital cancer treatments.

References edit

Truth in Medicine

Introduction edit

Effective treatment depends upon an accurate diagnosis; medicine is therefore dependent on the understanding of truth, from initial diagnosis through to treatment and prevention. This chapter explores the issues that arise from variations in truth within the medical field, where an estimated 15% of all diagnoses are inaccurate. Patients, doctors and Artificial Intelligence (AI) are the sources from which truth must be acquired and utilised, and an interdisciplinary approach helps explore the origins of these discrepancies in truth. The philosophical scope of the issue covers the concepts of strong and weak AI, the ethical ramifications, and the issues that could arise from the implementation of AI in medicine, using the work of philosophers such as John Searle to evaluate possible solutions to misdiagnosis. Finally, this chapter analyses the social and economic impacts of misdiagnosis.

Medical truth edit

There are many reasons why truth in medicine is not always uncovered, leading to misdiagnosis, delayed recovery and depletion of resources. Initial diagnosis is vital; however, due to human behaviour, practitioners categorise 15% of patients as ‘difficult patients’.[13]

Patient: A patient's behaviour, such as rude comments, demands and threats towards the doctor, directly influences the diagnosis of their illness.[14] Although the time doctors take to diagnose a patient does not vary, the diagnostic accuracy for ‘difficult patients’ is drastically lowered, by 20%.[15] The difficulty in obtaining truthful information from patients, or their lack of co-operation, may result in wasted time and resources.

Doctor: Doctors use sense perception and reasoning when diagnosing patients, but these ways of knowing are prone to human error. In a difficult situation, the practitioner’s initial diagnosis, not necessarily accurate, can become their final diagnosis. The uncomfortable environment becomes distracting, so subsequent statements given by the patient may not be considered by the doctor, and the true illness may remain undiscovered.[16] A patient's behaviour may also trigger emotional responses, in particular anger, affecting the doctor's judgement as they are not using their full capacity to evaluate their findings.[17] In addition, more mental capacity is directed at responding appropriately to a difficult situation, leaving less analytical capacity for diagnosis.[18]

Illness: Even when patients suffer from the same illness, each medical case is unique, as each person’s body, medical history and personal experiences differ. Chronic pain (pain that persists for longer than 12 weeks) illustrates this: the similarity between patients is the duration of the problem, yet everyone’s experience is diverse. The diagnosis of chronic pain is entirely dependent on the patient’s description of their experience, such as degree of pain, location and type of feeling. Pain is subjective, so the effectiveness of different treatments is personal, and there is no single treatment that will cure every case.[19] The lack of a universal truth in treatment makes it time-consuming, possibly prolonging the suffering.

Ethical Ramifications of Artificial Intelligence edit

To avoid the issues related to misdiagnosis of difficult patients by doctors, artificial intelligence could be used to provide a diagnosis unimpeded by emotion. Tests of this new technology have shown that artificial intelligence outperforms doctors in some aspects, as it has the capacity to test thousands of theories in under a second.[20] However, ethical considerations must be made concerning data protection and the ways in which this information could be abused. Maintaining a large online database for medical AI could create huge vulnerabilities for modern society as hacking becomes more and more commonplace. There are also worries about governments using such information as a form of covert surveillance.[21]

Strong AI vs Weak AI edit

John Searle’s concept of strong versus weak AI brings into question the nature of AI and whether it can really replace human interaction. Strong AI refers to AI that tries to completely replicate the actions and thought processes of a human, whereas weak AI is made only as an information-processing machine.[22] One could argue that medical AI needs only to be weak AI, acting as a catalyst for running through the possibilities at the diagnostic stage. Since the development of AI is reliant on the capacity of the programmer to encode a capable diagnostic system, the technology is only as strong as the research and programmers behind it. This raises the issue of who is responsible when negative outcomes arise as a result of using an AI-informed diagnosis.[23]

Social impact edit

Misdiagnosis is an underestimated problem: in developed countries most patients will be misdiagnosed once in their lifetime[24], often with life-threatening consequences. Yet very few organisations are engaged in systematic efforts to reduce the frequency of misdiagnosis.[25]

Those impacted by misdiagnosis are:

Patients: One in ten patients is harmed by the treatment they receive in hospital.[26] Between 40,000 and 80,000 patients die every year because of misdiagnosis.[24] Those most likely to be misdiagnosed are patients in poorer health and the elderly[27]. Misdiagnosis is particularly present in patients suffering from cancer and fractures, notably of the scaphoid.[28] Patients sometimes have to pay for lifelong care for permanent disabilities. Furthermore, misdiagnosis has repercussions on the social life of the patient, such as the loss of a family member or of the ability to work.

Institutions: On a larger scale, misdiagnosis causes hospitals to lose a significant amount of time, because patients need to be re-diagnosed, which hugely impacts their reputation as well as their income.[25] Businesses lose experienced employees, which lowers their productivity, and insurance payouts increase.

Economic impact edit

The cost of misdiagnosis is increasing faster than any other component of health care expenditure, impacting the budgets of patients, hospitals and public programmes. According to the Institute of Medicine (IOM), in the United States 30% of annual health care spending ($750 billion) is wasted on unnecessary services.[24] Similarly, in the United Kingdom, National Health Service (NHS) hospitals lose £197.2 million per year to cases of misdiagnosis[28], bringing the average cost of a misdiagnosis to $386,849 per claim. Misdiagnosis therefore has a huge financial impact not only on the resources of hospitals and patients, but also on public assistance programmes.

Conclusion edit

As medical knowledge improves, aided by advances in technology, it is likely that AI will help us solve many of these problems in the future. Strong concerns remain, however, about the safety of AI and its effectiveness when operating without human interference, and at present we are still dependent on the expertise of trained physicians. Finally, reducing the lack of truth in medicine depends on efficient communication between patient and doctor, and on the progress of research into understanding diseases.

Truth and Art

Introduction edit

What deems art ‘true’? What is ‘true art’? Truth in art can be examined through its capacity to imitate reality, its intention and reception, its authenticity and its subjective value. Defining ‘true art’ involves other disciplines in understanding its contextual information and the narrative behind it. Art can be understood as a model for envisioning the world from a personal point of view; the question of ‘true art’ therefore involves a subjective view of truth. Understanding artworks as models of the world can also help draw conclusions about other models, such as scientific ones, showing how 'truth in art' is an interdisciplinary concern.

Reality and Truth edit

Piet Mondrian, 1942 – New York City I, oil on canvas, 199.3 cm × 114.2 cm, Musée National d’Art Moderne, Paris [29]

Perhaps art is most truthful when depicting reality. The correspondence theory of truth elaborates: ‘truth or the falsity of a representation is determined solely by how it relates to a reality; that is, by whether it accurately describes that reality’ [30]. But how can we represent reality? Does even direct imitation omit part of reality? When art attempts to imitate reality, must it do so through hyper-realism, or can we also accept an abstract representation? Arguably, artworks such as Matisse’s The Snail and Mondrian’s New York City I represent a form of reality by interpreting visual stimuli in unexpected ways. Different artistic styles have the capacity to impart distinctive truths; the artist communicates their truth and the viewer interprets it to determine which they find truer to their own reality. Hence, can art ever be ‘untrue’? If artists and viewers decipher artworks individually, then art is subjective and cannot be ‘untrue’.

Art is a strand of aesthetic truth, which can be used to reveal other forms of truth, such as historical truth. It can inform historical events by contributing the artist’s personal understanding of reality: the artist’s response enriches objective historical truth by providing a subjective emotional reaction. An example is Otto Dix’s Der Krieg: his representation of WWI is used in history books[31] as visual support for historical truth. As a frontline soldier, his depictions of the horror of war challenged society’s heroic view of it. Without emotional artistic depictions, our understanding of history can be one-dimensional. Together, art and history enable a more inclusive portrayal of truth, bringing us closer to the event’s true reality.

Authenticity edit

What is meant by ‘authentic’ art? The Tate Gallery describes authenticity as ‘a term used [...] to describe the qualities of an original work of art as opposed to a reproduction’ [4]. Consider the comparison between Classical Roman and Greek sculpture: with the expansion of the Roman Empire, a multitude of Greek art was introduced and favoured by the Romans, leading to the distribution of reproductions across the empire. Are these ‘art’, or just forgeries of the original? Several factors of truth determine authenticity, such as time period and historical, social and political context. For this reason, the true nature of art is inextricably linked to its contextual information, meaning that its authenticity is ascertained with the help of many other disciplines.

Consequently, what is the value of reproductions? Is truth in value based on skill, or on authenticity? Traditional artworks continue to be remoulded within conceptual art. Wolfe von Lenkiewicz references masterpieces and significant historical figures within his work, evoking the notion that artists are inescapably influenced by the art of the past. His exhibition The School of Night comments on the idea of ownership and questions whether artwork can ever be truly ‘authentic’ [5]. Similarly, transcriptions of past artworks can suggest ideas distinct from those presented in the original. An example is Velazquez’s Las Meninas: Picasso alone made 58 transcriptions [6] in a variety of compositions, each remarking upon something different. Are these transcriptions 'authentic'? They reproduce the subject matter and composition of a previous work, yet create something different and original. As von Lenkiewicz suggests [5], isn’t all artwork a variation on previous work, whether an intentional transcription or not?

Intention and Institutions edit

Questions also arise when considering the truth of intention. Who decides the art’s intention: the viewer, the institution or the artist? Marcel Duchamp’s theory of the readymade expands upon this, determining ‘that what is art is defined by the artist' [32]. It is therefore important to consider who determines the nature and intention of an artwork. Do we suppose a piece is ‘art’ because it is displayed within a relevant institution? Is art valid or true just because an institution displays it? Institutions possess asymmetrical influence over visitors, imposing a certain vision of 'true art’ through curation and narrative. The authority they acquire is supported by the public’s confidence in them, rooted in the sociological concept of ‘institutional trust’ [33]. This idea was interrogated when two students left a pair of glasses on the floor of a San Francisco gallery as a prank, only for other visitors to assume they were a displayed artwork [34]. This incites conversation around the definition of ‘art’ and its intention: would we accept, for example, a simple chair as ‘true art’ if it were displayed in an art gallery, questioning its purpose or significance? Accordingly, was the spurious installation of the glasses ‘true art’? While the students had not intended it to be so, by inspiring conversation and reflection it perhaps became ‘true art’ in its own subjective right.

Leonardo da Vinci, 1492 – The Vitruvian Man, 34 cm × 24 cm, Gallerie dell’Accademia, Venice [35]

Scientific and Artistic Models edit

This discussion is relevant to understanding and developing scientific models. Both scientific and artistic models build on previous ideas and celebrate originality over reproduction. Both are representations of reality, providing a clearer understanding of complex ideas. However, science can be proven ‘untrue’, so are artistic models a more truthful representation of reality? The two kinds of model are distinct: artistic models are subjective, whilst scientific ones are more objective, and their validity can be tested against evidence. This often means scientific models are valued more highly. Who determines their value? Just as artistic institutions have the authority to establish hierarchies between artworks, scientific institutions determine and authorise the value of scientific models. By combining the models, we can achieve a wider understanding of our surroundings, as each reveals truths that the other omits. Leonardo da Vinci’s The Vitruvian Man exemplifies this: he drew on knowledge from both disciplines to depict what he intended to be the reality of the human form [36]. This demonstrates that applying both models, and looking at their interconnections, provides a more expansive perspective, and thus a more truthful depiction of the subject, than studying the disciplines separately.

Conclusion edit

In conclusion, ‘true art’ is determined by its intention, its authenticity and its likeness to reality. Why do we need art to be true? Art as a model of the world claims itself to be truthful, as we use this model to discover additional truths. This is comparable to scientific models, which share this purpose. By taking both models into consideration, we are more likely to engage with all facets of truth, gaining a more coherent understanding of a concept. Art will always be interdisciplinary, because it continually draws on other disciplines as primary inspiration and reciprocally informs those disciplines as a way of thinking.

References edit

Truth in Media in the case of Turkish Politics

Introduction edit

“The influence of the newspaper press, at the present day, is indeed very great, either for good or evil. Its influence is great for good, according to its truthfulness; for evil, according to its disregard of truth.” [37]

Truth is a broad and complex concept whose definition varies across disciplines. This chapter focuses on the interdisciplinary nature of truth in media studies and politics, specifically in Turkey. In this context, truth means unbiased media that reflects current politics accurately. For most of us, the rest of the world only exists through the media, which plays a significant role in forming our understanding of truth. The role of the media is especially visible in politics: according to political scientist Harold Lasswell, mass media affects political decisions since it depicts current affairs, interprets these events, and facilitates the socialisation of citizens into their adopted lifestyle[38]. In this chapter, media studies are used to demonstrate how a lack of truth, in terms of openness and accountability in news coverage, can significantly destabilise the political situation of a state.

What is Truth edit

Truth is one of the main ideas studied by philosophers. According to certain points of view, it cannot be personal and is universal, and would therefore stand in opposition to opinion.

However, its definition varies greatly across disciplines. Truth is considered to be much more universal in the hard sciences, whereas in the arts and humanities its definition is much less restrictive, as it can be more personal. According to Descartes, truth can only be reached by applying Cartesian doubt, which consists in questioning everything you know until you reach an undeniable fact; in Pyrrhonism, by contrast, truth varies between individuals, because every human being has their own personal truth.

In the media, facts, especially numbers and statistics, are often used as evidence. But they are also very easy to manipulate and to use to mislead and misinform a population, usually to give an advantage to a political party or politician. The famous saying that there are "lies, damned lies, and statistics" perfectly illustrates this point.

Moreover, if a misinformed or uninformed population is voting, whether that vote is genuinely democratic is questionable.

Case Study edit

Turkey’s political landscape is well summarised by Elif Shafak: “In Turkey, politics is a dangerous thing. In such a fluid, unsteady country, it is difficult, if not impossible, to predict the next month, let alone the future.”[39] After the 2017 referendum the president gained more control and power; however, citizens had a hard time accepting the result. Almost every election is surrounded by controversy, owing to rumours of stolen votes and to the biased media that shapes people's political beliefs. This controversy has led people to question truth in media and politics. The vast majority of media outlets are owned by allies of the president, and one implication of this is that those who criticise the government are prosecuted on baseless allegations of abetting terrorism.[24] Thus, the opposition parties struggle to be heard in the Turkish media. The pro-government Turkish newspaper Sabah covered President Recep Tayyip Erdogan’s point of view 19 times (35.1% of its coverage), while the opposition CHP leader’s opinions were mentioned only once (1.9%).[25] Mass media in Turkey provokes political polarisation: in the 2017 referendum, for instance, “Yes” voters were portrayed in a good light as supporting economic improvement and fighting terrorism, whereas "No" voters were accused of being terrorists siding with the plotters of the failed 2016 coup[40].

Discussions

Social media vs traditional media

Traditional media: Means of communication including newspapers, television and radio.

Social media: A term used in the new digital age to describe newer, non-centralised communication methods.[41]

As noted above, traditional media caused problems in the Turkish referendum because a select number of rich elites gained ownership of media sources. With the rise of social media as a disruptive technology in the political landscape, some of these problems of truth in media influencing politics can be contested, owing to social media's lack of centralisation. However, problems remain with the use of social media in politics: some critics have argued that a limited number of social media platforms are able to deliver targeted propaganda. Moreover, powerful platforms such as Facebook, WhatsApp and Twitter have had the ability to influence election results through the manipulation of facts.[42]

Political campaigns via social media are subject to less regulation, because no third party is present to fact-check and provide editorial judgement, in contrast to traditional media campaigns, where broadcasters must ensure that news is impartial, fair and true. Tambini states that some social media political campaigns contain misinformation, and therefore a lack of truth, due to the absence of a regulating body. Questions about truth in social media have also arisen with the emergence of the term "fake news" and the threat of foreign intervention in political campaigns, in which certain users were heavily targeted through 'psychometric profiling'. A study by Silverman highlighted that fake news circulated more widely on Facebook than popular mainstream news stories; 'fake news' often favours a certain candidate, reinforcing the problems seen in the Turkish referendum.[43]

Post-Truth politics in Turkish media

Caricature by political cartoonist Carlos Latuff: President Erdogan is seen 'silencing' the press.

"Post-truth" is an adjective “denoting circumstances in which objective facts are less influential in shaping public opinion than appeals to emotion and personal belief”.[44] In recent years, Turkish president Tayyip Erdogan has been using mass media to promote a 'post-truth narrative' in order to appeal to the people and consolidate a position of absolute power.[45] This narrative is strengthened by populist discourse (i.e. that the will of the people is absolute and that "what is 'popular' must also be good or true”[46]) dominating news coverage. According to the second condition of our definition of truth (i.e. that for a statement to be true it must be supported by adequate evidence and argumentation), “post-truth” populist statements must be classified as non-truths due to their subjective nature. In a speech sensationalised by a pro-government newspaper, president Erdogan stated: “. . . Who cares if you are an artist, a professor? First you will respect the people; you can never look down on the people."[47] This shows a strong sense of opposition to the expertise of “an artist” or “a professor”, while greater weight is given to the importance of “the people”. Given that expert opinion in politics is usually based on facts and justifiable argumentation,[28] attempts by the media to undermine expert critique leave people susceptible to manipulation by non-truths. As a result, “the ability of our society to make decisions in the public interest”[28] is seriously harmed: it hinders adequately informed judgements about a political situation. Professor James Pfiffner describes this "adherence to demonstrably false statements" as a phenomenon which "strikes at the very heart of democracy".[48] Equality, one of the basic foundations of democracy, is severely undermined. Once the people detach their opinions from factual evidence, greater ability to manipulate the truth is handed to those in power.

Conclusion

In conclusion:

  • the concept of truth in both traditional and social media is highly relevant to what we consider truth in democracy
  • media and politics are interdependent disciplines, as truth in media impacts truth in politics

References

  1. politics | Definition of politics in English by Oxford Dictionaries [Internet]. Oxford Dictionaries | English. 2018 [cited 1 December 2018]. Available from:
  2. post-truth | Definition of post-truth in English by Oxford Dictionaries [Internet]. Oxford Dictionaries | English. 2018 [cited 1 December 2018]. Available from:
  3. Pashkova V. Arendt's Political Thought: The Relationship Between Truth And Politics [Ph.D]. Western Sydney University; 2016. [cited 1 December 2018] Available from:
  4. Arendt H. Between past and future. 1st ed. New York: Viking Press; 1968. [cited 1 December 2018] 227.
  5. Pashkova V. Arendt's Political Thought: The Relationship Between Truth And Politics [Ph.D]. Western Sydney University; 2016. [cited 1 December 2018] Available from:
  6. Arendt H. Between past and future. 1st ed. New York: Viking Press; 1968. [cited 1 December 2018] 227.
  7. Arendt H. Between past and future. 1st ed. New York: Viking Press; 1968. [cited 1 December 2018] 228.
  8. The Electoral Commission. EU Referendum Results. [cited 29 November 2018] Available from:
  9. The Electoral Commission. Electoral Commission designates ‘Vote Leave Ltd’ and ‘The In Campaign Ltd’ as lead campaigners at EU Referendum. [cited 29 November 2018] Available from: [cited 29 November 2018]
  10. Johnson B. 15th September 2017. My vision for a bold, thriving Britain enabled by Brexit. The Daily Telegraph. [cited 29 November 2018] Available from:
  11. The UK Statistics Authority. UK Statistics Authority statement on the use of official statistics on contributions to the European Union. [cited 29 November 2018] Available from:
  12. The BMA. 15th November 2018. EU Survey 2018. [cited 8 December 2018] Available from:
  13. Davies M. Managing challenging interactions with patients. BMJ [Internet]. 2013 [cited 8 December 2018];:f4673. Available from:
  14. Ovens H. Part I: the difficult patient: medical and legal approaches. Can Fam Physician. [Internet]. 1989;35:1797-802. Available from:
  15. Inflammation; 'Difficult' patients increase doctors' misdiagnosis risk regardless of case complexity. NewsRx Health 2016 Apr 03:8. Available from:
  16. Mamede S, Van Gog T, Schuit S, Van den Berge K, Van Daele P, Bueving H et al. Why patients’ disruptive behaviours impair diagnostic reasoning: a randomised experiment. BMJ Quality & Safety [Internet]. 2016 [cited 8 December 2018];26(1):13-18. Available from:
  17. Lerner J, Tiedens L. Portrait of the angry decision maker: how appraisal tendencies shape anger's influence on cognition. Journal of Behavioral Decision Making [Internet]. 2006 [cited 8 December 2018];19(2):115-137. Available from:
  18. Mamede S, Van Gog T, Schuit S, Van den Berge K, Van Daele P, Bueving H et al. Why patients’ disruptive behaviours impair diagnostic reasoning: a randomised experiment. BMJ Quality & Safety [Internet]. 2016 [cited 8 December 2018];26(1):13-18. Available from:
  19. Chronic Pain: Symptoms, Diagnosis, & Treatment | NIH MedlinePlus the Magazine [Internet]. 2018 [cited 9 December 2018]. Available from:
  21. Artificial intelligence (AI) in healthcare and research. London, UK: Nuffield council on Bioethics; 2018.
  22. Bringsjord, Selmer, Govindarajulu, Sundar N. Artificial Intelligence [Internet]. 2018 [cited 7 December 2018]. Available from:
  23. Griffiths S. The big ethical questions for artificial intelligence (AI) in healthcare – Nuffield Bioethics [Internet]. Nuffield Bioethics. 2018 [cited 7 December 2018]. Available from:
  24. a b c d Pinnaclecare: The human cost and financial impact of Misdiagnosis. White paper [internet]. 2016. Available from:
  25. a b c Graber ML, Wachter RM, Cassel CK. Bringing Diagnosis Into the Quality and Safety Equations. JAMA. 2012;308(12):1211–1212. doi:10.1001/2012.jama.11913. Available from:
  26. Develin K, Smith R. One in six NHS patients 'misdiagnosed'. The Telegraph [internet]. 2009. Available from:
  27. Carter mw , et al. Nihgov. BMJ [internet]. 2014[Accessed 9 December 2018].Available from:
  28. a b c d Graysons. What Are Really The Top Misdiagnosed Conditions In NHS Hospitals In 2014/15?. Graysons.co.uk [internet]. 2015 [Accessed 3 December 2018]. Available from:
  29. Wikidata contributors.Q1871066. Wikidata. 19 November 2018. Available from: [Accessed 7 December 2018]
  30. David M. The Correspondence Theory of Truth. Stanford Encyclopedia of Philosophy. 10 May 2002. Available from: [Accessed 24 November 2018]
  31. Ivernel M., Villemagne B. Histoire-Géographie 3e. p20. Hatier Parution. 2016.
  32. The Tate Gallery. Authenticity – Art Term. The Tate Gallery. Available from: [Accessed 1 December 2018]
  33. Wikipedia contributors. Institutional trust (social sciences). Wikipedia: The Free Encyclopedia. Available from: w:Special:PermanentLink/815960113 [Accessed 7 December 2018]
  34. Mele C. Is It Art? Eyeglasses on Museum Floor Began as Teenagers’ Prank.  30 May 2016. The New York Times. Available from: [Accessed 29 November 2018]
  35. Wikidata contributors.Q21548. Wikidata. 20 November 2018. Available from: [Accessed 7 December 2018]
  36. Zöllner F. Leonardo. Cologne: Taschen. 2015.
  37. Truth in Journalism. Scientific American [Internet]. 1853 [cited 6 December 2018];8(46):365. Available from:
  38. Influence of mass media [Internet]. 2018 [cited 6 December 2018]. Available from:
  39. Shafak E. In Turkey, politics is a dangerous thing [Internet]. POLITICO. 2018 [cited 3 December 2018]. Available from:
  40. Simsek G. FRAMING OF TURKISH LEADERS’ 16 MAY 2017 REFERANDUM NEWS IN TURKISH MEDIA. Gumushane University Communication Faculty e-Journal [Internet]. 2018 [cited 8 December 2018];6(1):356-380. Available from:
  41. Definition of media [Internet]. 2009 [cited 9 December 2018]. Available from:
  42. Allcott H, Gentzkow M. Social Media and Fake News in the 2016 Election. Journal of Economic Perspectives. 2017;31(2):211-236.
  43. Tambini D. Digital Dominance The Power of Google, Amazon, Facebook, and Apple [Internet]. 1st ed. New York, NY: Oxford University Press; 2018 [cited 9 December 2018]. Available from:
  44. post-truth | Definition of post-truth in English by Oxford Dictionaries [Internet]. Oxford Dictionaries | English. 2018 [cited 6 December 2018]. Available from:
  45. Zeynalov M. Trump, Erdogan And Post-Truth Politics. Huffington Post. 2016;.
  46. McMullin E. Opinion | Trump's Rise Proves Populism Is Democracy's Greatest Threat [Internet]. NBC News. 2017 [cited 7 December 2018]. Available from:
  47. Haberleri G. Cumhurbaşkanı Erdoğan'dan o akademisyenlere sert tepki! (Translation: President Erdoğan reacts harshly to those academics!) [Internet]. Sabah. 2016 [cited 7 December 2018]. Available from:
  48. Pfiffner J. Trump’s lies corrode democracy [Internet]. Brookings. 2018 [cited 7 December 2018]. Available from:

Truth in Franco's Regime

How do art and literature express the truth about Franco’s regime?

This chapter discusses the relationship between notions of truth in politics, art and literature. We will analyse the political conflict that arose during the Spanish Civil War and the way that it was expressed by Pablo Picasso's painting "Guernica" and the book "Homage to Catalonia" by George Orwell.

Assessing truth in Art and Politics can be difficult, as artistic truth is often considered fully subjective. An interdisciplinary approach allowed us to appreciate the conflicting conceptions of truth in Art and Politics, discriminating between the objectively true aspects of each account and acknowledging how their biased messages help us picture the atmosphere of Franco's regime.

Introduction

Art and Literature can be used as tools to reconstruct the truth about past events. This task becomes more challenging when controversial political situations are involved. Between 1939 and 1975 Spain was ruled by a fascist dictatorship led by General Franco.[1] In 1936 the Popular Front, a leftist coalition, came to power in Spain.[2] After its election, Franco organised a coup that triggered the Spanish Civil War.[3]

During the war, artists and authors tried to expose the truth about Franco's regime. Artists residing in Spain were limited by political censorship and therefore could not openly express anti-fascist messages.[4] However, Orwell and Picasso did not face censorial limitations because they worked outside Spain. Both were leftists and opposed to Franco’s regime.

Subjectivity is a product of one's perception, culture and environment; objectivity is independent of human biases.[5] What follows is an application of these concepts to the analysis of Guernica and Homage to Catalonia.

Assessment of truth in Art

"Art is a lie that makes us realise the truth." – Pablo Picasso

Art is one of the best ways to exert a lasting influence over time. During the Spanish Civil War, Pablo Picasso expressed his outrage against Franco's regime with Guernica. This enormous mural-sized painting was created in 1937 and became one of Picasso's most famous artworks. It depicts the bombing of Guernica carried out by Hitler's German bombers, allied with Francisco Franco. The bombing destroyed almost the entire city and killed more than a thousand civilians.[6] This fascist attack was a consequence of the Basque nation’s attempt to introduce an independent democratic government. By showing the painting at the Paris Exhibition in 1937, Picasso denounced the massacre to the entire world.[7] Guernica became a symbol denouncing Franco’s fascist violence.[8]

Objectivity

Objective truth can be found in the painting through its symbols and scenes. Picasso used shocking scenes, such as a mother holding her dead child; the powerful pain represented in the painting reflects that felt by the Basque nation. Furthermore, Picasso used many symbols in the painting and never explained what they represented. Most interpretations hold that the bull characterised Franco, the horse the Spanish nation, the mother and her dead child the innocent civilians, and the body parts scattered throughout the painting the destruction of the city and its population.[9] All these symbols represent the real actors and consequences of the war, which lends the work objectivity.

Moreover, since the painting was not censored by the French government,[10] Picasso could show the plain truth without any barriers. This absence of censorship greatly helped Picasso depict the truth.

Subjectivity

Nevertheless, there is an important subjective truth in this painting, due to Picasso's biased perspective on the event. Firstly, Spain is Picasso’s home country, which probably affected him deeply and urged him to depict the bombing as a slaughter. Moreover, the artist did not witness the tragedy: he learned of the events from newspapers[11] in Paris. The painting was inspired by the pictures that reached Paris and by the demonstrations of thousands of people protesting in the city against the disaster.[12] Picasso's work was based on his interpretation of the newspaper reports, which is why the truth it tells is subjective.[13]

Finally, Picasso’s political beliefs may also explain why the painting is subjective. He stated: “No, painting is not made to decorate apartments. It's an offensive and defensive weapon against the enemy.” According to Picasso, art is a means of sending political messages. Indeed, the artist was politically engaged in the French Communist Party and firmly opposed to fascism. Picasso's political bias further underlines the subjectivity of Guernica.[14]

Assessment of truth in Literature

"If liberty means anything at all, it means the right to tell people what they do not want to hear." – George Orwell

Orwell wrote Homage to Catalonia after volunteering as a soldier in the POUM, the Marxist Spanish revolutionary party, during the Spanish Civil War.[15] He documented his experiences on the Aragon front, emphasising the peculiar conditions faced by the soldiers in the trenches. After the POUM was made illegal in 1937, Orwell was forced to flee.[16]

Objectivity

Experiencing the war and writing in the first person allowed Orwell to describe the objective truth about the atmosphere in Spain during the revolution. He started writing Homage to Catalonia immediately after leaving the front, which enabled him to remember details and express a more precise truth about what he had just lived through.

Moreover, the publication of Homage to Catalonia in England was not limited by censorship guidelines.[17] Therefore, Orwell’s original report is likely to be truthful and objective, since there were no regulations in England protecting Franco's regime. The book was successfully published in England in 1938, one year after his return.[18] Homage to Catalonia was first published in Spain in 1969, after a number of rejections and amendments.[19]

Subjectivity

Even though Orwell’s aim was to tell the truth, he admitted that art is linked to politics: “no book is genuinely free from political bias”.[20] He was politically biased because he was committed to the socialist cause and fought on the side of the POUM. Orwell himself acknowledged the limitations of his reportage: "my partisanships, my mistakes of fact and the distortion inevitably caused by my having seen only one part of the events". The author's bias is one of many limitations on the accuracy of the truth.

This may have resulted in a form of implicit self-censorship, described by Abellan as a response to his personal, social and historical constraints.[21] Historians have attempted to find ways to overcome the limitations of literature when reconstructing past events.

Conclusion

Artistic truth can be considered a form of propositional truth, propositions being the "loci of truth" found in an art piece.[22] To assess the validity of these loci it is necessary to understand the historical background of the artists, the events that inspired their work, the policies regulating its publication, and the artistic processes behind its creation.

Crossing the borders between artistic and political disciplines allowed us to appreciate what is considered true behind the events associated with the Spanish Civil War.

Sources

Imperialism and the IMF

Introduction

Wall writing against the IMF in Greece, by Georgetikis

The International Monetary Fund (IMF) is an organisation founded in Washington, D.C., providing loans and financial guidance to countries in economic crisis.[23] Arguably, the IMF's loan conditionality, including structural adjustments, has imperialist characteristics, which have caused unwanted reforms or large debts. Discussion of this issue is usually limited to economics, although politics and sociology also play major parts. This chapter will therefore analyse the imperialistic aspects of the IMF during the 1997 Asian financial crisis and the Greek government-debt crisis through an interdisciplinary approach, which will enhance the evaluation of the IMF.

IMF in Media

Portrayals of the IMF in mass media are often shadowed by criticism. The documentary film Life and Debt argues that the imposed reforms brought Jamaica only debt. The former Prime Minister accused the IMF's policies of undermining the sovereignty of many nations that had suffered from colonisation.[24]

1997 Asian Financial Crisis

The Asian crisis caused global panic, so the IMF intervened and provided bailouts for severely affected nations to restore confidence.

Economics

Critics pinpoint that the IMF exploited the crisis to pursue its economic agenda in Asia. In Korea, the IMF helped bail out financial agencies and external lenders while requiring low inflation and other reforms, including deregulation and capital liberalisation.[25] This fuelled local sentiment that the programmes were manipulated by America to benefit its industries.[26] As a consequence of the reforms, the stock market fell by more than 40% and the value of the currency by more than 50% compared with the year before.[27] Similar effects could also be found in other Asian countries: the contraction of the Thai economy deepened, while in Indonesia the rupiah kept falling and the economic situation further deteriorated.[28]

The role of the IMF was reconsidered by economists. Feldstein emphasised the importance of autonomy in domestic institutions, which should not be subordinate to international agencies.[29] Fischer stated that the IMF should endeavour to address the key problems underlying the crisis instead of rushing into structural adjustments.[30] Tabb argued that although in the long term these Westernised reforms might bring economic growth, the shakeout during the transition could be destructive.[27]

Politics

The IMF's political favouritism was evident during the Asian crisis. Given that the U.S. holds 17% of the voting power, U.S.-allied nations like Indonesia were prioritised.[31] Hence, the IMF’s neoliberal programmes and policies are tailored to the American government's wants, and countries with friendly relations with the U.S. are able to bend the conditions of structural adjustment.[32]

Indonesian president Suharto's pro-America tendency led the IMF to turn a blind eye to his notorious patronage system and nepotism, which precipitated the crisis. The IMF required Indonesia to eliminate subsidies and tax breaks for various monopolies owned by Suharto’s family. This political flaw caused the IMF’s rigid conditions to fall off-track in Indonesia, coinciding with increasing macroeconomic turmoil and the collapse of the financial system.[33] Moreover, policy changes imposed from above, such as trade liberalisation and privatisation, leave the country more susceptible to exploitation by multinational corporations. For instance, Indonesia became an export-oriented market, vulnerable to price wars.[34]

Sociology

Sociologists criticise the IMF's interventions in the Asian crisis by analysing the ideological construction of the institution. Sarah Babb argues that the IMF “[blindly promotes] free markets and its harsh austerity measures”,[35] a major departure from Keynes' original ideas about the IMF. Flawed multilateral agreements prompted "‘slippage’ [in the direction of the IMF] over time”,[35] suggesting that the members with the most power can shape the intergovernmental organisation’s policies in their favour and cripple other countries to bolster their own economies. During the Asian crisis the SAPs were radically laissez-faire, insinuating the US's authority over the conditionality of loans, since the economic meltdown coincided with the economic legacies of the Reagan era. The "mission creep" of the IMF, the expansion of its objectives beyond the original targets, can be owed to the nature of the organisation: when an institution such as the IMF depends on nations for resources,[36] it is diverted from its primary intent.

2010 Greek Government-debt Crisis

The long recession in Greece left it with excessive debts, political disputes and social problems. To prevent contagion, rescue packages were launched by the Troika (European Commission, European Central Bank and IMF), with austerity measures as preconditions.

Economics

The Troika lent nearly US$440 billion during 2010–2015, yet this culminated in a drop in GDP and a more severe, lasting debt burden: Greece's GDP fell by 25% and its debt-to-GDP ratio rose from 127% in 2009 to around 170%.[37] The IMF itself admitted having underestimated the damage caused by the fiscal consolidation policies.[38] Economists note that the bailouts simply transferred the debts instead of truly fixing Greece's problems.[39] Studies indicate the loans were actually used to pay previously accumulated debts, rescue private banks subordinated to other European banks, or compensate European investments, with no more than $8 billion reaching the Greek populace.[40] Meanwhile, the austerity measures drove the government to cut wages and drain money from businesses. Economic analysts such as Rasmus therefore identify 'an emerging new financial imperialism' behind the ideology of neoliberalism, meaning that within a union the underprivileged states' autonomy over their currency and fiscal expenditure is undermined, turning them into 'economic protectorates', as in the case of Greece.[41]
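These two figures are arithmetically consistent: even with no new net borrowing, a 25% fall in GDP alone pushes a 127% debt-to-GDP ratio to roughly 169%, close to the ~170% reported. A minimal back-of-the-envelope check (our own illustration, not taken from the cited studies; the indexed GDP of 100 is an arbitrary baseline):

```python
# Illustrative sketch: a debt-to-GDP ratio rises when the denominator
# (GDP) shrinks, even if nominal debt stays flat.

def debt_to_gdp(debt: float, gdp: float) -> float:
    """Debt-to-GDP ratio expressed as a percentage."""
    return 100.0 * debt / gdp

gdp_2009 = 100.0                    # index Greek GDP in 2009 at 100 (assumption)
debt_2009 = 127.0                   # debt was 127% of GDP in 2009
gdp_after = gdp_2009 * (1 - 0.25)   # GDP contracts by 25%

ratio_after = debt_to_gdp(debt_2009, gdp_after)
print(f"{ratio_after:.1f}%")        # prints "169.3%", near the ~170% reported
```

The point of the sketch is that most of the rise in the ratio is explained by the shrinking denominator rather than by new borrowing, which is why austerity that deepens the recession can worsen the very indicator it targets.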

Politics

Greece and the Troika, cartoon by Carlos

In the Greek crisis, the Troika had to agree on certain policies. While the IMF is a technocratic institution – though not perfectly immune from political bias – the EC and ECB consist of politicians representing countries, with potentially different interests and aims.[42] They differed on the realism of some economic projections, including the nexus between growth and the government budget.[43] However, the 27 EU members possess over 32% of the voting power, and the managing director of the IMF has always been a European.[44] Hence, the IMF’s involvement was approved speedily. Despite the slight differences in their objectives, this close link may weaken the IMF’s role and increase political leverage and pressure on Greece.[44] This suggests that political impact heavily sways the decision-making.[45]

Sociology

Sociologists propose that members of intergovernmental organisations tend to impose their norms at a global level, establishing them as global conventions; hence the IMF’s original objective, to prevent economic crises from spreading, has been digressed from.[46] In the Greek case, the austerity measures and reforms required by the Troika included wage freezes and cuts for public-sector workers, a higher retirement age, increased taxes and privatisation. This caused sharp rises in unemployment, while cuts in welfare spending disrupted social and healthcare services. This neoliberal reconstruction of the Greek economy was forced and did not take the Greek social model into account: social healthcare systems were severely overwhelmed by the drastic fall in health expenditure.[47] Although the Greek economy has somewhat returned to normalcy after a deep recession, its sufferings outweigh the gains.[46]

Conclusion

The Fund is both a cause and a symptom of imperialism. Its role as a provider of assistance should therefore be re-examined, and it should consider the unique contexts of the nations it supports rather than imposing cookie-cutter policies.[32]

When analysing real-life issues like the IMF, where disciplines are co-dependent, we benefit from adopting an interdisciplinary approach that provides multidimensional insights.

References

Imperialism in Fashion

This chapter focuses on fashion’s role in different disciplines and the extent of imperialistic influence within them. It is presented in four sections: first introducing fashion as an interdisciplinary academic field, then examining its scope in anthropology, politics, and economics.

Fashion as an Interdisciplinary Academic Discipline

A young discipline of some 30 years, fashion long lacked academic interest because it was regarded as a mere subcategory of material culture. Over the years, however, fashion aroused curiosity among scholars and became a topic of research, thanks to the abundance of methodological and theoretical frameworks it contains and to the interdependency of fashion and other disciplines in expanding knowledge. Examining fashion through an educational lens accordingly provides insight into its interdisciplinarity. Holly M. Kent’s Teaching Fashion Studies and Heike Jenss's Fashion Studies: Research Methods, Sites and Practices are seminal works that both demonstrate the interdisciplinarity of fashion-studies methodology through subject-specific examples. For instance, the section in the former book, “Analyzing the Social Functions of Dress in Different Historical Eras”, highlights the links between societies' historical fashion choices and how these convey knowledge about gender, race, sexuality, and identity.[48] Similarly, the latter book includes a section on the significance of fashion’s role in ethnographic research, drawing on the personal experience of Dr. Christina Moon.[49]

With the rapid growth of the fashion market, fashion education has become increasingly popular among students worldwide who want to pursue fashion in higher education. Responding to this demand, Western institutes like Condé Nast College of Fashion and Istituto Marangoni launched programmes in India and China to bring fashion education to these fast-growing economies.[50] Although the first fashion programme in China, launched at the China Academy of Arts and Crafts, aimed at manifesting the idea of the "Chinese Dream",[51] it could be argued that the Asian branches of Western schools try to implant their fashion into Chinese trends and gain a place in the Chinese market.

Aesthetic Politics and Modest Fashion

Aesthetic politics is a term associated with fascism, describing the use of propaganda to change the views of the public. In recent years it has returned to popular politics with Trump's election.[52] His rhetoric against Muslims has fed into the debate on terrorism, fuelling hostility and prejudice against Islam. This has partly influenced the political debate on Muslim women's religious freedom, their integration into Western-dominated societies, and their fashion choices. While some claim that wearing fully covering garments such as the niqab is unethical because of their oppressive connotations, many Muslim women find them empowering. In particular, Muslim women who have integrated into non-Muslim societies are proud of the vocation and courage it takes to stand out, even though their experience in Western cities is filled with prejudicial hostility tied to people's views on terrorism.[53][54] Additionally, the economic power of financially independent Muslim women is rising. Called Generation M, they demand greater representation and a wider selection of fashion choices.[55] While many Muslim women who choose to wear the hijab see it as a religious garment that should be neither decorative nor captivating, some have started finding new, exciting ways to wear it. Accordingly, brands like Nike have embraced the "modest fashion" trend by hiring hijabi models and designing pieces targeted at Muslim women. The controversy stems from the fact that the hijab aims to hide and divert interest, whereas modest fashion rebels against that and lifts it to the level of high fashion, making it a commercial product.[56]

Anthropology in Fashion since Colonialism

Prometheanist is a fresco mural painted by Jose Clemente Orozco in 1924 in Mexico City, Mexico. The image depicts the colonisation of Mexico by the Spanish: a coloniser stands above the body of a victim of violence.

Anthropology, the study of human societies, identifies fashion as a complex material culture, relevant to the ethnographic understanding of civilisations as a source of status, social cohesion, and ritual. Fashion has been a constant in societies and can be analysed during both colonialism and post-colonialism.

Colonial-era Mexico shows how the interpretation of fashion varies between cultures. This can be observed in the work of Jose Clemente Orozco, which documents the indigenous interpretation of Spanish steel armour by representing the soldier in a machine-like, dehumanised way as he stands on the body of an indigenous person.[57] In contrast, the Spanish viewed the armour as a symbol of social cohesion and of their status as colonialists.

The exchange of material culture during colonial times resulted in cultural translations in the form of appropriation and homogenisation.[58][59][35]

A rise in consciousness of cultural appropriation, as a result of the post-colonial hegemony of dominant over minority cultures, was seen in the 21st century. This infringement of collective intellectual property rights was recently documented in the 2012 case of Navajo Nation v. Urban Outfitters, over the unauthorised use of their designs, which Urban Outfitters ultimately won on the basis of fair use in 2016.[60]

Fashion is gradually becoming homogeneous on a global scale due to the fast and inexpensive production of clothing and Western influence.[61] The macro culture of haute couture gives people access to, and customisation of, Western fashion. This has resulted in the documented decline of national dress and the increasing Westernisation of clothing, for example in Meiji Japan, as Japanese society moved from an isolated feudal society to a Westernised form.[62]

The Socio-economics of the Fast-Fashion Industry

The significant diminution of protectionism and the opening of countries to free trade has allowed Western multinational companies such as Zara or H&M to establish themselves globally. For example, Inditex – the parent company of Zara, Zara Home, Bershka, Massimo Dutti, Oysho, Stradivarius, Pull & Bear and Uterqüe – now has over 5,900 stores in more than 85 countries.[63] Their omnipresence and low-cost products have given rise to a new way of consuming: fast fashion.[35] Fast fashion, today's prevailing production method, has two main characteristics. Firstly, production and distribution must be carried out in a very short period of time. Secondly, clothes must be very fashionable, which often means imitating luxury brands.[64]

Less Economically Developed Countries (LEDCs) are currently dominated by the industrialised countries, a situation known as economic neocolonialism.[65] In the fast fashion sector, this is demonstrated by countries' places in the production process. To satisfy consumerism and produce more at ever lower costs, fast fashion companies rely on offshoring as their main strategy. While clothes are designed in Western countries, they are produced in countries like China or Bangladesh, the two largest textile workshops in the world.[66] Multinational firms that divide their production process this way rarely take into account the interests of the nations in which they operate. This leads to the multiplication of sweatshops, where working conditions are extremely poor, as shown by numerous fires and factory collapses. In April 2013 in Dhaka, Bangladesh, more than 1,000 workers died in the collapse of Rana Plaza, an eight-storey building.[67][68]

Sources

Imperialism in the study of children’s toy preferences

This piece will discuss the legacy of imperialism within the interdisciplinary field of gender studies, using research into children’s toy preferences as a case study. The majority of research in this area focuses on early child development in Western countries; little material exists from non-Western countries, nor any cross-cultural comparison.[35] It can be argued that this gap in research across disciplines is a result of imperialism, and that the geographical factors that influence the biological and social determinants of children’s toy preferences are a legacy of imperialism.

Research into children's toy preferences is used here as a case study because it illustrates how gender differences present themselves before the influence of social factors, and how these social factors then interact with existing biological factors. Although some argue that, because biological factors influence gender development, geographical differences will be insignificant, the current consensus is that gender is a product of the interrelation between biological and social factors. However, this cannot be seen as universally conclusive when only Western societies have been investigated. Thus, a re-evaluation of the influence of Western bias in this area, and in all disciplines, is required so that the possible effects of geography on biological and social factors can be taken into consideration in order to achieve a comprehensive understanding of children’s toy preferences globally.

Imperialism’s influence on research

Sociological and psychological research has, thus far, primarily focused on Western, Educated, Industrialised, Rich and Democratic (WEIRD) societies[69][70], despite members of these societies not representing humankind as a whole[71]. Consequently, if the possibility that culturally specific findings might be misattributed as universal traits is not considered, this will adversely affect the scientific defensibility and reliability of theories[71].

Why does research conducted in the West primarily focus on WEIRD societies? Though early forms of higher education emerged in the Arab world, science flourished in the West after the twelfth century.[72] During the economic growth of cities in the 13th century, universities were founded across Europe by emperors seeking to expand their influence and rival that of other universities.[73]

European universities began to expand globally in the 16th century, with the intention to provide Western education for the colonists. In the 19th century, more European-style universities were founded in non-Western societies, often funded by non-Western societies in order to educate their own people in Western scientific methods[73]. Therefore, it can be argued that the current Western bias in global science is caused by the imperialist history of knowledge production.

While academia proliferated in Europe, Eastern countries continued to focus education on government-centred aims. For example, the imperial examination (keju) in ancient China was a civil service examination system designed to select the best potential candidates to serve as administrative officials.[74] Scientific research and innovation existed within governmental organisations only, meaning there was limited dissemination of knowledge. During the colonial era (1840–1949) in China, universities and independent academic research centres, including Tsinghua University, were founded, largely under the influence of imperialism.[75]

Biological basis for gender difference

Towards the end of the 20th century, research emerged into the biology of gender and gendered behaviours, separate from previous research which did not distinguish gender from sex. The volume of research in this area has increased over time, and an understanding of the biological basis for gender is now established. However, questions remain as to the extent of this influence and how it relates to social factors during childhood development.

Research in neuroendocrinology suggests that levels of prenatal and neonatal exposure to testosterone are responsible for the development of the brain as either ‘male-typical’ or ‘female-typical’ through ‘permanent neural changes’[76]. The prevalence of prenatal and neonatal testosterone has been linked to children's toy preferences; young girls with congenital adrenal hyperplasia (CAH), who therefore produce more testosterone beginning in utero than unaffected girls, spent ‘significantly more time’ playing with ‘boy’s toys’ than unaffected girls. This suggests that the early presence of higher levels of androgens (including testosterone) leads to gender differences in behaviours[77]. A 2017 paper by Todd et al. collates research into children’s toy preferences and concludes that gender difference in children’s toy preferences does exist. Though this appears to be a result of biological and social factors, the biological basis for gendered behaviour cannot be dismissed because of the consistency in finding gender differences in toy preference across many different studies[35].

The majority of research in this area is conducted in Western societies on Western children, a limitation that Todd et al. acknowledge. It can be argued that geographical difference in this instance should not matter, as biologically all humans are fundamentally the same, and suggesting otherwise could risk veering into the imperial legacy of eugenics. However, with the growth of research into epigenetics, it is known that environmental factors do influence gene expression, which may lead to gender differences in behaviours. Therefore, interdisciplinary research is required into differences in the sociology of Western and non-Western countries, and the extent, if any, to which these affect the biology of gender differences.

Social basis for gender difference

Social sciences arose from philosophy and science,[78] and so the influence of imperialism was still present: these new disciplines aimed to solve problems in the West, such as those around capitalism and urbanisation. Psychological and sociological studies on children’s toy preferences acknowledge the existence of biological factors in gender but do not explore them,[79] staying within disciplinary boundaries instead of initiating interdisciplinary research. Considering both close environmental influences and the representation of toys, the most relevant factors in children’s toy preferences are parents, peers, exposure to toys, and verbal and visual messages. These factors vary cross-culturally with variation in family structure, parenting principles, and media exposure.

Parental influence mainly consists of the toys chosen for the children – parental encouragement or discouragement does not have a significant influence[80]. For example, 12-month-old infants primarily showed interest in gender-stereotyped toys, then, secondarily, in toys which they were familiar with[81]. Social learning theory[82] suggests that parents can also create a bias by rewarding the ‘right’ toy choice, which is usually the gender-stereotyped one. Children’s peers have a greater influence on their toy preference than surrounding adults; in nurseries, peer influence may cause children to choose more gender-stereotyped toys[83]. Therefore, the importance of peer influence makes the difference between Western and non-Western socialisation customs more significant. Exposure to counter-stereotyped images[84] and models[85], such as TV shows and cartoons, was effective in encouraging children to be more open-minded in toy preference, and geographic variation in media should, therefore, be accounted for.

Conclusion

The historical legacy of imperialism in knowledge production, and imperialist influence on geographical factors, responsible for the misattribution of Western characteristics to all people, have impacted the study of children’s toy preferences across disciplines. The bias of research into children’s toy preferences, carried out almost exclusively in WEIRD countries, must be deconstructed to better fit our post-colonial world. This could be achieved by viewing research critically: questioning the history of the geographical context in which it was carried out, and investigating the influence of these geographical factors on the biological and social determinants of children’s toy preferences.

Imperial Influences on African Education Systems

Preface

The Berlin Conference (1884–1885), the formal beginnings of the Scramble for Africa.

European colonists have a long history of imperial influence in Africa, spanning from the Scramble for Africa in the late nineteenth century, through the post-war decolonisation era, to the present day.[86][73] This chapter examines how European colonists changed the education systems in their African colonies and explores the lasting impact of this influence on the region. We begin with an introduction to traditional African education, followed by a summary of how colonial education developed from a form of missionary work into a means of racial subjugation and economic exploitation. Furthermore, we discuss how academic imperialism was informed by culture, economics, and politics, exemplifying it as an interdisciplinary issue.

Traditional African Education

Traditional African education consisted of developing skills and values that would help youth become self-sufficient and productive members of the community. While the specific skill sets that were taught varied between tribes, children generally learned about agriculture, religion, moral principles, and social life within the community.[87] For example, children of the Bemba tribe in Northern Rhodesia could name fifty to sixty species of trees by the age of six, as it was a society based on "cut and burn" agriculture.[88] There was no distinction between manual and intellectual education.[87] Education did not occur in formal institutions; rather, skills were taught through experiences and knowledge passed down from elders. Thus, responsibility for education fell largely to the family in early development and shifted to the larger community in later stages of life.[87]

The Development of Colonial Education Systems

"Colonial schooling was education for subordination, exploitation, the creation of mental confusion... an instrument to serve the European capitalist class in its exploitation of Africa." – Walter Rodney[89]

The division of colonial Africa amongst imperial powers in 1913, showing the Belgian, British, French, German, Italian, Portuguese, and Spanish Empires, and independent states.

Pre-World War I: Missionary Education

The interference of Western actors in African education was primarily initiated by Christian missionaries. Early colonists regarded African beliefs as fictitious and a form of witchcraft.[88] They established missionary schools to correct these perceived misbeliefs and convert Africans to Christianity. Such conversion was a method to "civilise" the African people, thought to be both necessary and beneficial to African society.[87][71] In missionary schools, very little was taught about African culture and its contribution to society. Instead, children were taught to suppress their culture and nationalism and to idolise their European counterparts, allowing Europeans to assert social and economic superiority.[89][71]

Post-World War I: Centralised Education

After the First World War, the focus of education in African colonies shifted from religious subjugation to economic optimisation.[88] Colonists utilised education to facilitate the exploitation and domination of the continent by training a small number of Africans to work low-ranking jobs for the local government or for private European companies.[88] By doing so, colonists were able to maximise economic productivity in the colonies to fund the capitalist pursuits of the Empire.[90] The placement of Africans in government also allowed colonists to exercise a greater degree of control over the colony. A native elite class, educated with Western ideals, would better satisfy the general population while being more effective at propagating acceptance of the colonial hegemony.[89] Despite the vast wealth generated by the colonies, their education systems were poorly funded. In 1935, just 2% and 3% of gross national income (GNI) was spent on education in the British colonies of Nigeria and Kenya respectively, with the result that only 0.2% of African youths received higher education.[88]

Decolonisation: From Subjugation to Self-governance

After the Second World War, the international community agreed that imperial powers should work towards the gradual decolonisation of their imperial possessions. Thus, education in African colonies shifted to a curriculum that promoted the development of a self-governing nation.[88] Since then, the overwhelming control of Western ideals over education in Africa has largely diminished, although certain groups believe the imposition of Western academia in Africa still exists (e.g. through globalisation and foreign aid).[35]

The Informing Disciplines of Academic Imperialism

Culture

Africa is presently dominated by European languages – namely French, English, and Portuguese.

Racist ideologies and the propagation of cultural superiority were deep-rooted within colonial schooling.[88] Prior to the implementation of British schooling, African children were taught to appreciate the history and culture of their tribe. They were taught valuable skills and customs unique to their community, and such knowledge would be passed on to future generations by elders in the tribe.[87] Thus, indigenous education was very much intertwined with indigenous identity. The stigmatisation of indigenous pedagogies – and, by extension, indigenous culture – has led to the loss of languages, religions, and history.[35] African youths feel that knowledge of their own culture has little use in a world dominated by Western society: for example, the majority of Africans speak a European language, such as French or English, and indigenous languages are becoming increasingly unpopular.[90] Young Africans were further disadvantaged by their lack of understanding of their own heritage, which made them feel alienated from their communities.[88]

Economics

Human capital was mainly defined by its significance to GNI.[73] With this in mind, global education organisations such as the World Bank focused primarily on teaching people in developing nations new skills directly correlated with the modernisation of the country. It was a scheme for establishing ideas of economic competition and growth through education. However, the process was largely unsuccessful, and the focus shifted to allocating resources to certain levels of education, still with growth in human capital in mind. This led to the conclusion that compulsory primary education contributed most significantly to a country's gross domestic product.[73] Consequently, international organisations' influence on education in low-income nations increased, which is, in itself, a new form of academic imperialism. This process created a dependence on Western teaching practices and prevented indigenous populations from conducting their own research on educational curricula and further learning. The wide use of Western textbooks and other learning materials ensured that a 'Western-centric' type of education prevailed as the desirable one.[73]

Politics

With the surge of new political elites, taking control of educational structures was seen as an opportunity for political influence and the formation of new concepts of sovereignty. In some cases, this meant establishing support for dictatorship or one-party rule; in others, the creation of a new liberal mindset. The widespread modern view was that education is necessary for further globalisation, poverty reduction, and economic growth. Despite this, the African illiteracy rate stood at over 80% – twice the world average – and only 16 of 13 million Congolese received higher education – a deliberate ploy by Belgian colonists to incrementally "civilise" their subjugated population.[73] This last statistic shows how the political ideologies of imperialist states permeated education systems, emphasising the interdisciplinary influence of academic imperialism.

Conclusion

Overall, widespread imperial influence on the development of previously colonised African territories has been exerted in multiple ways through education. 'Western-centric' learning material, emphasis on certain subjects, and linguistic tendencies have affected not only the way that children learn, but their perception of self-identity, history, and valuable skills, and through that the economy and politics of the region. To comprehend the full effect of imperialism on education, the scope of research has to exceed disciplinary boundaries.

References

Imperialism: a black and white issue?

A study was carried out to test the hypothesis that the variation in the pigmentation of skin colour in diverse populations is consistently correlated with the mean measured IQs of various groups.[91] The notion that people's cognitive abilities can be ranked on a sort of hierarchical scale seems absurd. The main limitation of such a study design is the reasoning behind the causal basis of the correlation. Does the pigmentation of one's skin actually define intelligence? Or is it simply a social construct, which has fed into the system, enabling the elites to retain their socio-political power? These questions may be something for the 94 per cent of politicians sitting in the House of Commons who are white to think about.[92]

Colourism: Imperialist Roots

This scientific correlation parallels the ideologies underpinning the Age of Enlightenment in the 18th century: an intellectual and philosophical revolution which identified Europeans with superior intellect and beauty.[35] These principles were later embodied in the work of Immanuel Kant, who claimed that the colour of an individual's skin was “clear proof that what he said was stupid”[93] (p. 38). According to Western religious beliefs, blackness is associated with sin and whiteness with purity; European religious folklore overflows with stories ranging from sin turning men black to black people being born in hell. Considered more profoundly, the very concept of IQ and intellect was founded during the Age of Enlightenment. Are these scientific tests based on a concept designed to facilitate European superiority?

The unequivocal link between the psychological damage of slavery and the development of skin bleaching is highlighted by Deborah Gabriel.[35] The imperialist domination of African nations dehumanised those who were enslaved,[35] establishing an exclusive standard of human beings based on Western superiority. Between 1526 and 1867,[71] approximately 12.5 million slaves were shipped from Africa to the West. Gabriel argues that the scars of this "tragic past"[35] (p. 101) developed from centuries of being perceived as second-class citizens.[35]

Thinking about colourism further, Gabriel defines the concept as a "system of privilege and discrimination based on the degree of lightness" of skin colour[35] (p. 5). Furthermore, Bodenhorn and Ruebeck[94] argue that colourism developed during the slavery era in America, when light-skinned slaves were disproportionately selected to work as house slaves, whereas those with relatively darker skin were forced into the fields. Having a lighter skin tone was thus regarded as the basis for a better standard of living,[71] which further highlights the enduring imperialist influence. Moreover, centuries of “irreparable cultural damage”[35] (p. 96) from enslavement laid the foundations for the phenomenon of skin bleaching that persists to this day.

Skin perception in the mass media

This issue of colourism is reflected in the media in the forms of advertisement, magazines, movies, television and the internet. Mainstream media plays an important role in the construction of the black image, shaping society's understandings of blackness and beauty, and often dissociating the one from the other. For instance, the underrepresentation of dark-skinned females in advertisements contributes to the promotion of the unhealthy, "racist" idea of "black ugliness" in women[35] (p. 19), indicating the existence of white supremacy in beauty standards.

As argued by Deborah Gabriel[35] (p. 28): “because white skin is personified as the beauty ideal, lighter skin women are seen as more beautiful than darker skinned women”. Even though we live in a diverse society, popular culture keeps privileging light-skinned women over their darker counterparts, as they are closer to whiteness and Eurocentric features. In 2005, four African-American women (Halle Berry, Alicia Keys, Sophie Okonedo and Oprah Winfrey) appeared in People magazine's list of the '50 Most Beautiful People', but all of them except Winfrey had a lighter complexion, a product of their mixed-race heritage.[95] Furthermore, the glorification of white beauty is clearly visible in the fashion industry, which is dominated by fair-skinned models.[35] However, earlier studies have found that even black American magazines such as Ebony leave little room for dark-skinned black women in their pages.[35] This highlights how colourist bias is also embedded in the minds of the black community. Given that light skin is treated as a marker of beauty and attractiveness, dark-skinned women may suffer from low self-esteem in a world that fails to represent them and that constantly rewards and values whiteness. In fact, researchers found that “a change in skin colour from dark to light is associated with a .28 increment in self-esteem”[96] (p. 347). That is to say, colourism actively affects women's perception of their dark-skinned selves in a negative way. The issue is being brought to the attention of international audiences thanks to notable celebrities such as actress Nandita Das. In an interview with The Guardian, Kavitha Emmanuel, founder of the Indian NGO Women of Worth, explains how the 'Dark is Beautiful' campaign, endorsed by the actress in 2009, “is standing up to bias toward lighter skin in India”.[97]

India: the biggest market

Figure 1. Skin bleaching cosmetic products. Photo taken in a shop in London on the 1st December 2018.

During the nineteenth century, Great Britain was one of the leading imperialist countries of the West and had held colonies in India since the 1600s. At the end of the 19th century, British emigration to India increased rapidly as the British imperial government encouraged the ideological reproduction of the Empire.[98] Nationalist Britons who moved to India considered themselves a superior race with respect to the Indians. As they were a minority, the British were mainly interested in Indians for their army and workforce, while higher positions were reserved for white people or, in some cases, lighter-skinned Indians.[73] The idea of a privileged, lighter-skinned ruling class is deeply embedded in Indian culture, such that even after independence in 1947, lighter skin was still considered more desirable.[73] The market for fairness cosmetics and creams in India is estimated at approximately US$450 million today, with this cosmetic branch growing at 20% per annum.[88] According to “a conjoint analysis of consumer preferences”,[88] males constitute an estimated 20% of total sales of fairness creams in India, and teens make up 10% of sales of fairness skin cosmetics (p. 13): these products have penetrated Indian society as a whole.

Conclusion

Skin bleaching products are not only widespread in India but can be found worldwide and are very easily available, as we can see from Figure 1, a photo taken recently in central London. Internationally renowned Western cosmetics giants, such as Garnier, which owns 7% of the total market share,[89] are the main actors behind this obsession with a fair complexion, which continues to grow. Ironically, this is not only an issue involving the colonised, but is also prevalent in the heart of post-colonial Britain. As the preceding economic, historical, psychological and sociological analysis shows, skin bleaching continues to be an urgent and extremely widespread issue: the global skin-lightening industry was worth $4.8bn in 2017.[89]

References

  1. Editors, Francisco Franco, History, published in 2009 [accessed on 19th November 2018]
  2. John Simkin, 1936 Spanish Elections, Spartacus Educational, updated in August 2014 [accessed on 3rd December]
  3. Editors, Spanish Civil War breaks out, History, published in 2010 [accessed on 7th December 2018]
  4. Marta Rioja Barrocal, English-Spanish Translations and Censorship in Spain 1962–1969, inTRAlinea, published in 2002 [accessed on 29th November 2018]
  5. Anna Papafragou, Epistemic modality and truth conditions, ScienceDirect, published in 2005 [accessed on 4th December]
  6. Ishaan Tharoor, Eighty years later, the Nazi war crime in Guernica still matters, The Independent, published in 2017 [accessed on 23rd November 2018]
  7. Ed, Ten facts about the Bombing of Guernica, History Collection [accessed on 5th December 2018]
  8. Editors, Guernica Returned to Spain, History, published in 2010 [accessed on 30th November 2018]
  9. Jackie Pike, Piecing together Guernica, BBC News, published in 2009 [accessed on 6th December 2018]
  10. Share America [accessed on 28th December 2018]
  11. Art and Architecture Towards Political Crises: The 1937 Paris International Exposition in Context, Culturedarm [accessed on 17th November 2018]
  12. Martin Minchom, The truth about Guernica: Picasso and the lying press, The Volunteer, published in 2012 [accessed on 1st December 2018]
  13. Guernica: Testimony of War, PBS [accessed on 22nd November 2018]
  14. Alex Danchev, Picasso's politics, The Guardian, published in May 2009 [accessed on 29th November 2018]
  15. Secker and Warburg, Homage to Catalonia, British Library, published in 1938 [accessed on 25th November 2018]
  16. George Orwell, Homage to Catalonia, Chapter 7. Published by Penguin Books in 2013
  17. Marta Rioja Barrocal, English-Spanish Translations and Censorship in Spain 1962–1969, inTRAlinea, published in 2010 [accessed on 1st December 2018]
  18. The Critical Heritage, Jeffrey Meyers, B. C. Southam (editor), Introduction: Controversy, reviews and reputation, page 14, published in 1975
  19. Alberto Lazaro, The Censorship of George Orwell's Essays in Spain [accessed on 2nd December 2018]
  20. Tejvan Pettinger, Orwell's biography, Biography Online, published in 2014 [accessed on 19th November 2018]
  21. Marta Rioja Barrocal, English-Spanish Translations and Censorship in Spain 1962–1969, inTRAlinea, published in 2010 [accessed on 1st December 2018]
  22. T. M. Greene, The Arts and the Art of Criticism, page 80, 1940, Princeton University Press, London [accessed on 10th November 2018]
  23. International Monetary Fund. About the IMF: the IMF at a glance. Available from: [Accessed 5th December 2018].
  24. "Life and debt" Film website. About Life and Debt. Available from: [Accessed 5th December 2018]
  25. Medley JE. The East Asian economic crisis: surging U.S. imperialism?. Review of Radical Political Economics.2000 September 1;32(3):379-387.[13]
  26. Vatikiotis M. Fund under fire. Far Eastern Economic Review. 1998;161(20): 60-62.
  27. a b Tabb WK.The East Asian financial crisis. Monthly review.1998;50(2):24-38.[14]
  28. Katz SS. The Asian crisis, the IMF and the critics. Eastern Economic Journal. 1999;25(4):421-439.[15]
  29. Feldstein M. What the IMF should do. the Wall Street Journal. 1998 October 6
  30. Fischer S. The IMF and the Asian crisis [Lecture] University of California Los Angeles. 20th March 1998.
  31. International Monetary Fund. IMF Members' Quotas and Voting Power, and IMF Board of Governors. Available from: [Accessed: 7th December 2018].
  32. a b Stiglitz J. Globalization and Its Discontents. New York: W.W. Norton & Company; 2002.
  33. Kim SW. Covering globalisation: A comparative analysis of news reports about the 1997 Asian economic crisis and the IMF bailout[dissertation on the Internet]. Indiana University; 2001[Cited 8th December 2018] Available from:
  34. Kutan A, Muradoglu G, Sudjana B. IMF programs, financial and real sector performance, and the Asian crisis. Journal of Banking & Finance. 2008;36(1):1-4. Available from:[Accessed 8th December 2018]
  35. a b c d e f g h i j k l m n o p q r s Babb S. The IMF in sociological perspective: A tale of organizational slippage. Studies in Comparative International Development. 2003;38(2):3-27. Invalid <ref> tag; name ":0" defined multiple times with different content
  36. Babb S, Buira A. XVIII G24 Technical group Meeting.Mission Creep, Mission Push and Discretion in Sociological Perspective: The Case of IMF Conditionality, Geneva 2003;1(1):3-27. Available from:[Accessed 8th December 2018]
  37. Eurostat. General government gross debt-annual data. Available from: [Accessed 5th December 2018]
  38. Elliott L, Smith H. IMF 'to admit mistakes' in handling Greek debt crisis and bailout. The Guardian [Internet]. 2013 [cited 5th December 2018] Available from:
  39. Janssen R. Greece and the IMF: who exactly is being saved? Available from: [Assessed 5th December 2018]
  40. Rocholl J, Stahmer A. Where did the Greek bailout money go? ESMT Berlin. Publication number: WP-16-02.
  41. Rasmus J. Greek debt and the new financial imperialism. Available from: [Accessed 5th December 2018].
  42. Ardagna S, Caselli F. The Political Economy of the Greek Debt Crisis: A Tale of Two Bailouts. American Economic Journal: Macroeconomics. 2014;6(4):291-323.
  43. El-Erian M. Greek debt: IMF and EU's quick fix isn't enough | Mohamed El-Erian [Internet]. The Guardian. 2017 [cited 7th December 2018]. Available from:
  44. a b Seitz F, Jost T. The Role of the IMF in the European Debt Crisis. 1st ed. Aschaffenburg: Hochschule Wissenschaften; 2012 Available from: [Accessed 6th December 2018]
  45. Hetzner C, Kyriakidou D. Europe's debt crisis may spread: IMF; Greek collapse? National Post[Internet]. 2011 [cited 7th December 2018]. Available from:
  46. a b Kentikelenis A, Seabrooke L. The Politics of World Polity: Script-writing in International Organizations. American Sociological Review. 2017;82(5):1065-1092.
  47. Kondilis E, Giannakopoulos S, Gavana M, Ierodiakonou I, Waitzkin H, Benos A. Economic Crisis, Restrictive Policies, and the Population’s Health and Health Care: The Greek Case. American Journal of Public Health. 2013;103(6):973-979.
  48. Lauren Downing Peters (2018) Teaching Fashion Studies, Fashion Theory, DOI: 10.1080/1362704X.2018.1518750
  49. Anneke Smelik (2017) Fashion Studies. Research Methods, Sites and Practices, Fashion Theory, 21:5, 617-620, DOI: 10.1080/1362704X.2017.1310436
  50. Amed, I. and Mellery-Pratt, R. (2018). Is Fashion Education Selling a False Dream?. [online] The Business of Fashion. Available at: [Accessed 7 Dec. 2018]
  51. Templeton L. World’s leading fashion schools open in Asia to meet rising demand [Internet]. South China Morning Post. 2018 [cited 7 December 2018]. Available from:
  52. Billet. Donal Trump and the Aesthetics of Fascism [Online newspaper]. The Guardian; 2016 [Cited 2018 Dec 7]. Available from:
  53. Almila A. Fashion, Anti-Fashion, Non-Fashion and Symbolic Capital: The Uses of Dress among Muslim Minorities in Finland. Fashion Theory. 2016;20(1): 81-102. Available from:
  54. Tarlo. Visibly muslim – Fashion, Politics, Faith [eBook]. Berg fashion library 2010 [cited 2018 Dec 7]. Pages 131-160. Available from:
  55. Sherwood. Meet Generation M: the young, affluent Muslims changing the world [online newspaper]. The Guardian 2016 [cited 2018 Dec 7]. Available from:
  56. Mann. The immodest Modest Fashion controversy in France [blog post]. Carol Mann 2016 [cited 2018 Dec 7]. Available from:
  57. Ideas and ideologies in twentieth century Latin America. Cambridge: Cambridge University Press; 1998.
  58. Crane D, Bovone L. Approaches to Material Culture: The Sociology of Fashion and Clothing. Poetics [Internet]. 2006 [cited 2 December 2018];34(6):319-333.
  59. Calefato P. Fashion as Cultural Translation: Knowledge, Constrictions and Transgressions on/of the Female Body. Social Semiotics [Internet]. 2010 [cited 3 December 2018];20(4):343-355.
  60. The Fashion Law. Urban Outfitters Wins Latest Round in Navajo Nation Case [Internet]. The Fashion Law. 2018 [cited 5 December 2018]. Available from:
  61. The Value of Fast Fashion: Quick Response, Enhanced Design, and Strategic Consumer Behavior. (2011). Management Science, 57(4), 778-795.
  62. Hirano K, Chen Y. The State and cultural transformation. Tokyo: United Nations University Press; 1993.
  63. Joy A, Sherry J, Wang J, Chan R. Fast Fashion, Sustainability, and the Ethical Appeal of Luxury Brands. Fashion Theory. 2015;16(3): 273-295. Available from :
  64. Cachon G, Swinney R. The Value of Fast Fashion: Quick Response, Enhanced Design, and Strategic Consumer Behavior. Management Science. 2011;57(4): 778-795. Available from:
  65. Austruy J. 'Néo-colonialisme' in Encyclopædia Universalis, 2018. Available from:
  66. de Rocquigny T, Minvielle G, Mouhoud E. Au fil de l'éco (3/4). L'odyssée mondiale du vêtement. [podcast] 2018. Available from: [Accessed 5 Dec. 2018].
  67. Dhaka collapse toll passes 1,000. BBC News. 2013 [cited 5 December 2018]. Available from:
  68. Hobson J. To die for? The health and safety of fast fashion. Occupational Medicine. 2013;63(5), 317–319. Available from:
  69. Nielsen M, Haun D, Kärtner J, Legare C. The persistent sampling bias in developmental psychology: A call to action. Journal of Experimental Child Psychology. 2017;162:31-38.4
  70. Rad M, Martingano A, Ginges J. Toward a psychology of Homo sapiens: Making psychological science more representative of the human population. Proceedings of the National Academy of Sciences. 2018;115(45):11401-11405.
  71. a b c d e f Henrich J, Heine S, Norenzayan A. The weirdest people in the world?. Behavioral and Brain Sciences. 2010;33(2-3). Invalid <ref> tag; name ":1" defined multiple times with different content
  72. Sanz N, Bergan S, Council of Europe, Eurimages (Organization). The heritage of European universities. 2nd ed. Strasbourg: Council of Europe; 2006.
  73. a b c d e f g h i Rudy W. The universities of Europe, 1100–1914: a history. London: Fairleigh Dickinson University Press; 1984. Invalid <ref> tag; name ":2" defined multiple times with different content
  74. Benjamin A. ELMAN. Civil Service Examinations (keju), The University of Princeton, 2009.
  76. Roselli C. Neurobiology of gender identity and sexual orientation. Journal of Neuroendocrinology. 2018;30(7):e12562.
  77. Berenbaum S, Hines M. Early Androgens Are Related to Childhood Sex-Typed Toy Preferences. Psychological Science. 1992;3(3):203-206.
  78. Backhouse, R., & Fontaine, P. (2010). The History of the Social Sciences since 1945. Cambridge: Cambridge University Press. doi:10.1017/CBO9780511845260
  79. Marie E. Bathiche, Children's game and toy preferences: A contemporary analysis McGill University, 1993
  80. Boe, J.L. & Woods, R.J. Sex Roles (2018) 79: 358.
  81. Joshua Leroy Boe, PARENTS’ IMPACT ON PREGENDER CHILDREN’S TOY PREFERENCES, North Dakota Sate University, 2014
  82. Marie E. Bathiche, Children's game and toy preferences: A contemporary analysis, McGill University, 1993
  83. Todd, B., Barry, J., & Thommessen, S. (2016). Preferences for ‘Gender-typed’ Toys in Boys and Girls Aged 9 to 32 Months Infant and Child Development
  84. Spinner, L., Cameron, L. & Calogero, R. Sex Roles (2018) 79: 314.
  85. Barry J. Zimmerman and Richard Koussa, Contemporary Educational Psychology Volume 4, Issue 1, January 1979, Pages 55-66
  86. Harlow, Barbara. Volume Introduction: The Scramble for Africa. In: Harlow, Barbara, Carter, Mia (eds.) The Scramble for Africa. Durham, NC: Duke University Press; 2004. p. 1-9.
  87. a b c d e Omolewa, Michael. Traditional African Modes of Education: Their Relevance in the Modern World. International Review of Education. 2007;53(5/6): 593-612. Available from: [Accessed 25th November 2018].
  88. a b c d e f g h i j Rodney, Walter. How Europe Underdeveloped Africa. United Kingdom: Black Classic Press; 1972. Available from: [Accessed 4th December 2018]. Invalid <ref> tag; name ":4" defined multiple times with different content
  89. a b c d e Whitehead, Clive. British Colonial Education Policy: a Synonym for Cultural Imperialism? In: Mangan, J.A. (ed.) Benefits Bestowed? Education and British Imperialism. London: Routledge; 1988. p. 211-230. Invalid <ref> tag; name ":3" defined multiple times with different content
  90. a b Phillipson, Robert. Linguistic imperialism: African perspectives. ELT Journal. 1996; 50(2): 160-167. Available from: [Accessed 1st December 2018].
  91. Jensen AR. Comments on correlations of IQ with skin color and geographic–demographic variables, Intelligence. 2006;34(2): 128-131. Available from: [Accessed 30th November 2018].
  92. Wood J, Cracknell R. Social and General Statistics Section. Ethnic Minorities in Politics, Government and Public Life. 2013: 1-10. Available from: [Accessed 28th November 2018].
  93. Kant I. On the Different Races of Man. In: Eze EC. (ed.) Race and the Enlightenment: A Reader. (38-64). Cambridge, Mass: Blackwell; 1997.
  94. Bodenhorn, H. & Ruebeck, C.S. J Popul Econ. Colourism and African–american wealth: evidence from the nineteenth-century south. 2007; 20: 599-620. Available from:
  95. Harrison, Matthew S. Racism in the 21st Century: An Empirical Analysis of Skin Color. New York: Springer-Verlag New York Inc; 2010: p.52.
  96. Maxine S. Thompson, Verna M. Keith. The blacker the berry: Gender, Skin tone, Self-Esteem and Self-Efficacy. Gender and Society. 2001; 15(3): 336-357 .
  97. Abraham MR. "Dark is beautiful: the battle to end the world's obsession with lighter skin". Available from: [Accessed 6th December 2018].
  98. Kaul C. From Empire to Independence: The British Raj in India 1858–1947, The Government. Available from: [Accessed 4th December 2018]

Imperialism in whitening of 21st century Brazil

Imperialism is the “policy of extending a state’s influence over other peoples or territories”[1]. Historically, such expansion was driven by Western powers, which exerted their influence over others and established their own ideals in the societies they dominated. This chapter will analyse to what extent imperialism has resulted in the promotion of ‘whitening’ in Brazil. This claim will be evaluated using an interdisciplinary approach, focusing on socio-economic hierarchies, immigration policies and the dominance of the media. We will argue that the promotion of whitening in Brazil is due to its imperial legacy.

Literature review

The painting 'A Redenção de Cam' (Modesto Brocos, 1895) depicts a black grandmother, her mulatto daughter, the daughter's white husband and their white child. The grandmother lifts her hands to the sky, thanking God that her grandchild is white. The painting illustrates the notion that, through two to three generations of interracial reproduction, black characteristics would vanish.

'Whitening’ can be defined as “the act or process of becoming white”[2]. The concept can be divided into two main categories: biological and symbolic. Symbolic whitening describes the “ideology that emerged from the legacy of European colonialism in Latin America that catered to white dominance”[3]. Biological whitening refers to racial whitening through interracial marriage.

The 'whitening' of race, society and ideals in Brazil has roots in the country's imperial past. When slavery was abolished in 1888, Brazil had the largest population of African descent of any country except Nigeria.[4] The Brazilian elites viewed this as a problem, believing that the presence of the ‘inferior’ race would restrict the country’s development. Biological whitening, through generational interracial reproduction, was considered the solution to the ‘Negro problem’[5]. This notion was legitimised by scientific racism, which claimed that the ‘Caucasian' race was genetically and culturally superior to the ‘Negro’ race, and by Darwin's theory of natural selection and the ‘survival of the fittest’[5].

In general, Brazilians regard white as superior[5]. This symbolic whitening can be described by the concept of colourism, which is “the process of discrimination that privileges light-skinned people of color over their dark-skinned counterparts”[6]. Margaret Hunter claims that the reason many are unaware of their ‘white’ preference is "because that dominant aesthetic is so deeply ingrained in our culture”[6]. Latin Americans have thus learned to incorporate these imperial values into their society and glorify European features and light skin tones[6].

Racial hierarchies and socio-economic class

Imperial prejudices imported from Europe have established racial hierarchies within Brazil[7], which in the 21st century have had significant socio-economic implications. During the Empire, economic centres developed where white landowners began mass production and exportation of commodities. While some regions grew exponentially, others (populated predominantly by native and black Brazilians) remained in poverty, leading to vast regional divisions between the ‘progressive’ south and ‘undeveloped’ north. Economic growth was concentrated in white areas because Europeans had the means to accumulate wealth, owning land, slaves and capital, whereas black and native Brazilians did not, due to their historical racial status.[7] This has caused the 'whitening' of economically developed areas of Brazil and intensified regional divisions of race since the end of the Empire. A 2015 study on the ancestral distribution of the population supports this, showing that European ancestry is highest in the South (77%) and African ancestry is highest in the Northeast (27%)[8]. This strengthens the argument that economic and regional inequalities correlate with race in Brazil and have contributed to the whitening of its southern cities. Although an explicit causal relationship cannot be established between these modern economic hierarchies and imperialism, the influence of the racism that segregated the non-white population of Brazil during the Empire is mirrored in current regional divisions.

Data showing the distribution of genetic ancestry throughout the regions of Brazil.[8]
Region European African
North 51% 16%
Northeast 58% 27%
Central-West 64% 24%
Southeast 67% 23%
South 77% 12%
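The regional pattern the table describes can be checked with a short, illustrative Python snippet. The dictionary below simply transcribes the table's values; the snippet is a sketch for verifying the chapter's claim, not part of the cited study.

```python
# Regional genetic ancestry shares (%), transcribed from the table above,
# based on the 2015 admixture study cited in the text.
ancestry = {
    "North":        {"European": 51, "African": 16},
    "Northeast":    {"European": 58, "African": 27},
    "Central-West": {"European": 64, "African": 24},
    "Southeast":    {"European": 67, "African": 23},
    "South":        {"European": 77, "African": 12},
}

# The chapter's claim: European ancestry peaks in the South,
# African ancestry peaks in the Northeast.
max_european = max(ancestry, key=lambda r: ancestry[r]["European"])
max_african = max(ancestry, key=lambda r: ancestry[r]["African"])
print(max_european)  # South
print(max_african)   # Northeast
```

Simple checks like this make the regional correlation explicit without asserting anything beyond what the table already states.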

Immigration policies

Racial hierarchies established during the Empire have also biologically whitened the population of Brazil through miscegenation[9], enabled by European immigration. Selective immigration policies began in 1890, just after the end of the Empire, explicitly prohibiting the entry of “black and yellows” while preferring white labour to supplement agriculture[5]. Similar discriminatory policies continued into the 20th century, such as the nationality quotas set in 1945 with a clear preference for Europeans[9], illustrating the endurance of the racial hierarchies established during the Empire. The growth of the white population accommodated the elites’ desire for an overall whiter demographic[10]. The 'whitening' process was furthered through encouraged miscegenation[9], believed, on the basis of Social Darwinist theories, to have the ability to eradicate ‘blackness’. Brazil’s immigration policy contributed greatly to the success of these 'whitening' aims: census data show that between 1872 and 2010 the white population increased from 38.14% to 47.73%, while the black population decreased from 19.68% to 7.61%. However, the white population peaked in 1940, at 63.47% of the country, before steadily declining[3]. This suggests that biological 'whitening' has occurred in Brazil as a result of imperial racism, visible in the demographics of the 21st-century population. However, it has not occurred as widely in the 21st century as it did in the early 20th, due to the end of mass European immigration.

Depiction of the white ideal in Brazilian media

'Isaura the Slave' 2004 telenovela remake

The process of 'whitening', initiated by imperialism, is currently promoted in Brazil by the media. Studies show that 122 million Brazilians are active on social media[11] and that 81% watch television as their main source of leisure[3], revealing the media's power to influence the population. Billboards depict the ‘good life’, linking their campaigns to the Brazilian elite, who are commonly Caucasian, and suggesting that a lighter appearance opens the pathway to socio-economic progress[12]. This advertising of the ‘white ideal’ reveals an entrenched colonial mentality, persuading Brazilians to invest in their appearance in order to reach imperial beauty ideals, which are seen to reward them with monetary success or societal acceptance[13]. The ‘Brazilian Blow Dry’, a chemical process allowing black women to straighten their hair permanently to adhere to the white ideal, has been a huge success around the world[12]. Furthermore, although Brazil currently has the largest black population outside of Africa, the most famous models regularly shown in Brazilian advertisements, such as Gisele Bündchen, Alessandra Ambrosio and Adriana Lima, are all white. Brazilian news and entertainment industries also fail to represent the black population. Prone to ‘white-washing’, telenovelas have cast white leads in roles written as black characters in the original novels, as in ‘Isaura the Slave’. The depiction of interracial couples on television has also implicitly contributed to the current biological whitening of Brazil[14]. This representation of what a successful partnership looks like has increased the desire for interracial relationships, whitening the population through miscegenation.

Recently there has been a rise in the number of cases of online racism towards black women[15]. Facebook and Twitter have become modern-day platforms for anonymous racism, much of it echoing imperial-era prejudice. In 2017 there were 63,698 reported cases of cyber hate comments, a third of which were racist comments directed at black Brazilians[15]. To avoid this abuse, more black people try to 'whiten' their appearance symbolically, or the appearance of their children biologically. Ideas developed through imperialism therefore remain present in 21st-century Brazilian society, through their representation and reproduction in the media.

Conclusion

Imperialism, by creating racial and socio-economic hierarchies reinforced by immigration policies, has supported the 'whitening' ideology in Brazil. However, in the 21st century the media plays the most significant role in the promotion of 'whitening'. Globalisation and capitalism can be seen as a modern form of imperialism, suggesting this is not an issue restricted to Brazil but rather a global phenomenon. In the case of Brazil it is evident that imperialism has ingrained racial disparities into society and continues to influence the promotion of 'whitening' to this day.

References

Imperialism and educational projects

Imperialism has left a lasting impression on many of today's social systems, with roots in the colonisation imposed by the West on underdeveloped countries[16]. By discussing neo-colonialism, the discourse of imperialism distances itself from a purely historical sense, and by delving specifically into academic imperialism we see examples of contemporary colonialism in educational schemes run by the United Nations (UN) in the developing world. The positive and negative perspectives on these schemes will be examined through an analysis of scholars’ thoughts and of the motivations behind the schemes. Specifically, we discuss educational development projects that follow a Western academic structure, revealing them to be a form of academic imperialism.

Arguably, the dismantling of European empires after the Second World War marked the end of imperialism[5](p.188), yet certain scholars, including Furedi, argue that though this may be a post-colonial era, it is not post-imperial, as imperialism is not necessarily a formal construct.[5](p.189) More recently, scholars have defined imperialism as a broader concept; Galtung describes imperialism as a specific relation between a Centre and a Periphery, and between their respective centres and peripheries.[6](p.48) Such definitions allow imperialism to be identified in the modern world.

Different arguments regarding the existence and cause for educational imperialism predominantly lie in two conflicting views for the causes of imperialism. Schumpeter argues that imperialism is the result of man’s psychological, irrational and instinctual inclination to war. He believes capitalism counteracts imperialism, as it provides an outlet through healthy competition, eliminating economic reasons for conquest, since free trade allows all nations to have equal terms in the global market.[6](p.34-6) Hence, the spread of schooling systems encouraging capitalism (Western systems) diminishes the incidence of imperialism, agreeing with neoclassical development theory that schooling is a liberating element.[6](p.42)

Lenin, conversely, claims that imperialism is an inevitable development of capitalism as monopolistic conditions replace competitive ones, forcing monopolies and oligopolies to expand to survive.[6](p.39-40) Therefore, global education policies from the West, which promote capitalism, allow countries to be controlled by advanced countries’ monopolies, resulting in imperialism and exploitation.[6](p.43) This echoes neo-Marxists and dependency theorists, including Andre Gunder Frank, who argue that certain countries have been made dependent by capitalism, specifically by the IMF and World Bank, which have restructured the world economy for the continued flow of wealth to the West.[5](p.188-9) In academic teaching, any positive effects are felt solely by the powerful core, whilst education for the periphery is used by the core to perpetuate the current hierarchy, thus allowing for educational imperialism to occur.[6](p.57)

In 2015 the UN set the Sustainable Development Goals (SDGs), spanning a number of underdeveloped areas, as a ‘blueprint to achieve a better and more sustainable future’.[17] One of these (SDG4) focuses on education in third world countries to, according to the UN, ensure all learners acquire the knowledge and skills needed to promote sustainable development through a culture of peace and non-violence, global citizenship and an appreciation of cultural diversity.[18] Generally, the UN’s actions are not questioned by society. However, it is important to reflect on whether this educational intervention is an effective platform for genuinely improving education or whether it promotes a form of neo-colonialism. Could the UN’s goal for intervention be to further the control of its leading countries? Or is it genuinely trying to improve education globally, by teaching students the importance of the intersection between economics, social justice and environmental sustainability to foster a better understanding and implementation of sustainable development?[19]

In critically analysing the intervention of international organisations such as the UN in the developing world, it cannot be denied that the roots of these goals are firmly planted in the values held by Western Europe and the US. The emphasis in SDG4 is upon the term ‘global citizenship’, which can broadly be interpreted as globalisation, a more socially acceptable term used in popular discourse to infiltrate the culture and infrastructure of a less powerful country by breaking down resistance to foreign transnational corporations.[20] Thus it could easily be suggested that the UN’s educational goals are a form of neo-colonialism, imposing their own intellectual value systems and ‘capital accumulation through academic means’.[3](p.1)

Further to this, the UN, itself a colonial creation, also emphasises the term ‘sustainable development’. According to Sumner[3], this is defined as development that meets the needs of the present without compromising future generations; however, in doing so we are effectively ‘supporting the competitive forms of human engagement that build the wealth of the private elite’(p.12). Consequently, every aspect of the education system in highly developed countries may simply be rolled out to less developed countries behind the pretext of ‘sustainable development’, perpetuating the extreme elitism of Western academia. This was observed with the preceding Millennium Development Goals, which slowed progress due to a heightened emphasis on targets and national averages.[21] The diversity in education we appreciate today would be rendered into academic homogeneity as every country becomes a ‘blueprint’ of a Western superpower.

Despite indications that the SDGs may represent the interference of a colonialist entity, it can also be argued that academic imperialism is not always detrimental to the countries involved. If a purely theoretical economic lens is adopted, then Western education that encourages competitive capitalist thinking, and thus promotes these types of markets, is actually beneficial. Competitive markets maximise welfare (the sum of consumer and producer surplus) and produce no deadweight loss.[9](p.382) Indeed, one reason for studying these markets is that they are viewed as the standard against which other markets should be compared.[9](p.251) Hence an interdisciplinary conflict can be observed if, instead of viewing the issue from a sociological perspective, it is viewed using only economic theory. Furthermore, as Rothkopf reflects, globalisation works towards cohesion by removing cultural barriers, taking a vital step towards a more stable world with better lives for its citizens; he heralds cultural imperialism as a marker of the progress of civilisation.[22] Parallels can thus be drawn with academic imperialism, and in this case with the UN’s education goals: perhaps this academic imperialism is a step towards enhanced understanding and communication between intellectual communities.

The UN’s SDGs for education are glorified in society as the selfless improvement of education in the developing world. However, as we have discussed, it is not so simple; many of the methods and approaches are not in line with genuine need and serve mainly the imposing party. Undeniably, these goals promote a positive agenda of equal opportunities and removing inequalities[23], but that is not to say that academic imperialism cannot become a by-product of these seemingly innocent educational schemes, producing adverse effects equal to or outweighing the claimed benefits. Truthfully, only time can tell what impact these schemes will have on a country's future intellectual independence and growth, but we can conclude that they exhibit academic imperialism to varying extents. Understanding the inherent consequences of these interventions is essential for developing positive future projects to further global education, taking a more interdisciplinary, holistic approach that accounts for the culture and values of the society being developed, for example by involving ethnographic research to create a curriculum suited to the region being taught.

References

  1. Collins English Dictionary (2018) "Imperialism". HarperCollins Publishers.
  2. Merriam Webster (2018) “Whitening”. Merriam Webster Online Dictionary.
  3. Akande, H. (2016), "Illuminating the Blackness: Blacks and African Muslims in Brazil", Rabaah Publishers, London.
  4. Araujo, A. (2015), "African Heritage and Memories of Slavery in Brazil and the South Atlantic World.", Cambria Press.
  5. Skidmore, T. E. (1993), "Black into White: Race and Nationality in Brazilian Thought", Duke University Press, Durham and London, pp. 38-48.
  6. Hunter, M. (2007), "The Persistent Problem of Colorism", Sociology Compass, Vol.1(1), pp.237-254.
  7. Weinstein, B. (2015), "The Colour of Modernity: Sao Paulo and the Making of Race and Nation in Brazil", Duke University Press.
  8. Rodrigues de Moura, R., Coelho, A. V. C., de Queiroz Balbino, V., Crovella, S., Cavalcanti Brandão, L. A. (2015), "Meta‐analysis of Brazilian genetic admixture and comparison with other Latin America countries", American Journal of Human Biology, Vol 27(5), pp. 674–680.
  9. Fitzgerald, D. S., Cook-Martin, D. (2014), "Culling the Masses: The Democratic Origins of Racist Immigration Policy in the Americas", Duke University Press.
  10. Dos Santos, S. A. (2002), "Historical Roots of the "Whitening" of Brazil", Latin American Perspectives, Vol.29(1), pp.61-82.
  11. Carro, R. (2017), "Urban Brazil: Digital News Report", University of Oxford.
  12. Beserra, B. (2011), "Cultural Imperialism and the Transformation of Race Relations in Brazil", Latin American Perspectives, Vol.38(3), pp.194-208.
  13. Skidmore, T. E. (1995), "Fact and Myth: Discovering a racial problem in Brazil", The Helen Kellogg Institute for International Studies.
  14. Guaraná, B. (2018), "Taís Araújo: The Black Helena against Brazil's Whitening Television", Black Camera, Vol.10(1), Indiana University Press, pp.42-66.
  15. Trindade, L.V.P. (2018), "Brazil's supposed ‘racial democracy’ has a dire problem with online racism", The Conversation, University of Southampton.
  16. United Nations. Country classification. Available from: [Accessed: 2018, December 9].
  17. United Nations. The United Nations Sustainable Development Goals. Available from: [Accessed 17th November 2018].
  18. United Nations. The United Nations Sustainable Development Goal 4. Available from: [Accessed 17th November 2018].
  19. United Nations. Education for All Agenda. Available from: [Accessed 18th November 2018].
  20. Laxer G. Radical transformative nationalisms confront the U.S. empire. Current Sociology. 2003;51(2): 133-152. Available from: doi:10.1177/00113921030512006.
  21. Anon. Progress for Children: beyond averages learning from the MDGs. United Nations International Children’s Emergency Fund (UNICEF). Report number: 11, 2015.
  22. Rothkopf D. In Praise of Cultural Imperialism?. Foreign Policy. 1997;(107): 38-53. Available from: doi:10.2307/1149331.
  23. Sustainable Development Goals Fund. Goal 4: Quality education. Available from:[Accessed 18th November 2018].

Imperialism in Urban Planning

Introduction to Imperialism and Urban Planning

In 2007 the world’s urban population surpassed 50% and has since been increasing steadily (1). Many cities facing this influx of people lack the infrastructure, housing and governance necessary to safely accommodate their populations (2). Historically, urban planning has often been utilised by political authorities to consolidate their power, particularly during colonial and post-colonial periods, where an inherent power relationship is evident (3). The dominant urban planning theories of those eras were largely based on European models, which still have profound consequences for present-day cities and the overall development of their economies, societies and politics (4). It is therefore crucial to study the imperialistic nature of the field of urban planning in order to decolonise relevant fields more thoroughly.

Origins of Imperialism in Urban Planning

While archaeologists largely agree that the world's first city was settled ca. 4500 BCE (5), the oldest known urban planner was Hippodamus, credited with first writing about zoning and the gridiron plan, which would become widespread throughout the world (6). However, even earlier remnants of cities showing grid-like design have been found, dating back to 2500 BCE, such as Mohenjo-daro in the Indus Valley (7).

Greek ideas of city planning were spread through their colonisation of the Mediterranean and were further built upon by the Romans, whose imperialism shaped their urban planning in many ways, from the numerous viae built to support troop movements and strengthen the empire, to the absence of walls in new cities made possible by the widespread Pax Romana (8).

European Colonisation

As European powers colonised the globe, they used urban planning and architecture to extend imperial rule, building tall churches and lavish buildings to demonstrate the power of the invading state. In Latin America, Spanish colonisers redesigned hundreds of preexisting Native American settlements, turning them into Spanish cities with a central plaza surrounded by a large church, administrative buildings and the homes of the wealthy elite (9).

Rio de Janeiro

The “January River” was first encountered by Europeans in January 1502, and Rio itself was founded in 1565. The region had previously been inhabited by the Tupi, Puri, Botocudo and Maxakali peoples. When gold and diamonds were found nearby in the late 1600s, Rio became Brazil’s primary port. In 1763 the colonial administration was moved to Rio, and in 1808 it became the capital of the Portuguese Empire as the Portuguese royal family fled Napoleon’s invasion. While the developed centre of the city housed the Church and some nobles, poor and indigenous settlements sprawled over the neighbouring hillsides (9).

Addis Ababa

Though Ethiopia is often considered never to have been colonised, it was occupied by Italy in the years leading up to and during the Second World War. At the end of the 1930s, Italian planners attempted to move Addis Ababa's centre away from the palace at the city’s then centre (10). Upon returning to power, Emperor Haile Selassie attempted to shift the urban centre north, but ultimately followed Italian planning principles and moved the urban centre back to where the Italians had initially planned (10).

Johannesburg

Colonial powers did not only take over preexisting cities but also built new cities in areas of interest, such as near mineral reserves and on seafronts. Johannesburg was originally founded next to the Witwatersrand gold reef to further South Africa’s extractive mining economy (9). While the city developed in a fairly unplanned fashion, the White settlers' continual appropriation and division of land previously owned by African and other peoples of colour prevented these communities from participating in economic markets and helped colonial powers to oppress them (11).

The Modern City

Such imperialism extended beyond the reach of traditional militaristic imperialism, affecting not only the subjects of the empire but its very core, as in Haussmann's renovation of Paris, where large swathes of the city were demolished to build wide new boulevards. This set the precedent for the modern city of the 19th and 20th centuries, with many cities following the example.

Cairo

Cairo’s initial settlements date back to the Persian fort of the Heliopolite Nome. The oldest structure in the city today is the Roman fortress of Babylon, since whose construction the city has been ruled by successive Muslim caliphates and the Ottoman Empire (12). While Napoleonic French forces held the city for only three years, in the mid-1800s Isma'il Pasha redesigned Downtown Cairo to look like Paris (12), as modernity was at the time believed to be synonymous with European-ness (13).

Colonial Japan and Seoul

Japan played a unique role in strengthening the artificial link between modernity and progress in urban planning. After the Iwakura Mission, the Meiji government initiated a series of plans to modernise Tokyo, such as the Ginza Bricktown project. With consultancy from European urban planners, the government implemented elements considered modern, including 'the Western architecture, sidewalks, sewers, roadside trees, and paved streets of the modern cityscape' (14, p.511). The Japanese colonial government later imposed a similar modernisation on Seoul, partly to demonstrate its power and legitimise its hegemony over Korea by contrasting a modern Japanese city with a supposedly primitive Korean one (14).

Lasting Effects of Imperialism in Urban Planning

Urban Planning

Rapid urbanisation often puts massive pressure on infrastructure, particularly in the less developed and less “cared for” parts of a city. The favelas in cities such as Rio date back to the original disorderly hillside settlements. While the city centre was supported and developed by colonial powers as the centre of economic growth, migration into the outskirts of the city caused favelas to grow uncontrollably (9). In the 20th century, Latin America urbanised faster than anywhere else on the planet, further deepening spatial segregation (9).

Social/Economic

While large-scale capitalist urban industrial development often blurs boundaries between social classes, this effect can be hampered in some cities by spatial segregation dating from colonial rule. Across South Africa, apartheid and the concentration of development in traditionally “White” areas of cities have largely prevented “coloured” peoples from accessing tertiary-sector employment (9). In Rio, favela residents have limited political representation, which hampers their infrastructure and economic development and induces cycles of poverty (11).

Problematic Concept of Modernity

The idea of modernity has largely developed since the 17th century, and its core philosophy concerns breaking from the past and from traditional European culture. This 'emergence of certain historically specific social formations' (15, p.14) is closely tied to another concept: progress. The connection is particularly problematic in fields like urban planning, because it celebrates only specific ways of development while disregarding others. During the process of modernisation, varying degrees of cultural destruction therefore become inevitable (16).

In Academia

In the mid-20th century, the theorists Michel Foucault and Henri Lefebvre examined the ways in which cities and space affect the individual. This has led to more focused research into the symbolic and nuanced psychological effects of colonial city planning and architecture, and to efforts to decolonise cities.

References

  1. United Nations Population Division. Urban population (% of total). The World Bank Data. Available from: [Accessed 1st December 2018].
  2. Özden K, Enwere C. Urbanization and its Political Challenges in Developing Countries. Eurasian Journal of Business and Economics. 2012;5(10): 99-120. Available from: [Accessed 29th November 2018].
  3. Foucault M, Crampton J W ed., Elden S, ed. Space, Knowledge and Power: Foucault and Geography. Aldershot:Ashgate; 2007.
  4. Baruah N G, Henderson J V, Peng C. Colonial legacies: Shaping African Cities. London: London School of Economics;2017. Available from:
  5. Mark J J. The Ancient City. The Ancient History Encyclopedia. Available from: [Accessed 9th December 2018].
  6. Owens E J. The City in the Greek and Roman World. New York: Routledge; 1992.
  7. McIntosh J. The Ancient Indus Valley: New Perspectives. ABC-CLIO; 2008. Available from: [Accessed 9th December 2018].
  8. Cilliers, L.; Retief, F.P. City planning in Graeco-Roman times with emphasis on health facilities. Akroterion; 2006. Available from: [Accessed 9th December 2018].
  9. Socolow S M, Johnson L L. Urbanization in Colonial Latin America. Journal of Urban History. 2018;8(1):27-59. Available from:
  10. Rifkind D, Silva C N ed. Urban Planning in Sub-Saharan Africa. New York: Routledge; 2015. p. 145-164.
  11. Davies R J. The spatial formation of the South African City. GeoJournal. 1981;2(2):59-72. Available from:
  12. Raymond A, Wood W ed. Cairo. Cambridge: Harvard University Press; 2000. p. 328.
  13. Elsheshtawy Y. Revolutionary Cairo and Urban Modernity: Lessons from the Sixties. 15th International Planning History Society Conference. Sao Paulo. 2012. Available from: [Accessed 28th November 2018].
  14. Tristan R G. Paving Power: Western Urban Planning and Imperial Space from the Streets of Meiji Tokyo to Colonial Seoul. Journal of Urban History. 2016;42(3):506-556. Available from:
  15. Robinson J. Ordinary Cities: Between Modernity and Development. New York: Routledge; 2006. Available from:
  16. Mignolo W D. The Darker Side of Western Modernity: Global Futures, Decolonial Options. Durham: Duke University Press; 2011. Available from:

Imperialism and Humanitarian Architecture

Literature review

Various aspects of imperialism have been defined throughout the past century. Lenin wrote that imperialism is “the development and direct continuation of the fundamental characteristics of capitalism in general”[1] and, more specifically, “the monopoly stage of capitalism”. Kautsky also argues that imperialism results from advanced capitalism[2]. Robert Young suggests that it is a concept or an ideology rather than an actual practice[3]. According to the author Esther Charlesworth, humanitarian architecture aims at helping vulnerable regions and communities[4]. But it can be harmful: W. Easterly wrote that a lack of understanding of a population’s needs, combined with the wish to supply as much as possible in a limited amount of time, results in failed projects[5]. The rise of humanitarian architecture through volunteer-tourism agencies generates small, unhealthy interventions led by inexperienced people. S. Nutt has explained how the idea that Western countries should and can help poorer regions is a vestige of colonialism[6]. Moreover, Bussell and Forbes have underlined that volunteer-tourists are driven by egoistic motives even though they may feel they are being altruistic[7].

Introduction

In 2014, a clinic in Turkana, Kenya, was built by a group of MIT students as part of a humanitarian architecture project[8]. Such projects aim at helping communities in need and vulnerable regions through the construction of buildings or other facilities. It is, however, striking how many of them do not succeed. In the case studied here, the volunteer students failed in various respects and did not manage to respond efficiently to the local emergency[9]. This kind of failure is a result of westernised perceptions, which raises the question of how a humanitarian project can reveal itself to be an imperialistic legacy.

The pretext of altruism

The pretext of altruism in some humanitarian projects is an imperialistic legacy. Volunteering, for instance, often uses false pretences to reconcile personal benefit and altruism. The motivations of the workers can revolve around selfish reasons, such as seeking others' gratitude or personal benefits[10]. The case study chosen is an example of humanitarian work done for the benefit of the students themselves, as the initial goal was to develop the students' skills, not solely to help the pastoral population of Turkana. As Raymond and Hall[11] state, there is a difference between volunteer tourism and using developing countries as 'training grounds' for students. This mirrors the way Western powers colonised Africa for their own purposes (such as exploiting resources) under the pretext of helping locals[12].

The clinic built by MIT students contributes to a new form of colonialism called "educational neo-colonialism". This new wave of action is a legacy of imperialism, using the so-called "inferiority" of a population in need to impose westernised help[13], without proper research into the population's actual needs. This type of volunteerism focuses on "the other" and reinforces preconceived ideas, rather than questioning them, to legitimise Western actions[11]. The clinic was built for educational purposes while mirroring imperialistic mindsets.

Disregard towards Turkanas

While intervening in foreign countries for their own purposes, international agencies often disregard locals. Hugo Slim argues that local people should be considered "as human subjects and not the humanitarian objects of others"[14]. The clinic project does not respect this, as its aim is educational rather than humanitarian. The project thus seems to use the Turkanas' health needs as a pretext for a personal project, and this is not consequence-free. For instance, Samantha Nutt notes the possible psychological consequences of such projects on children, such as a hyper-affective syndrome[15].

A nutritional report identifies health issues in the region as a cause of rising mortality[16], but the project lacked the anthropological and geographic research needed to make local health its genuine motivation. For example, the students had to change the planned architectural techniques and materials because, once there, they discovered that these were not adapted to the local resources and the Turkanas' traditional nomadic life[17]. The modern design of the clinic is a form of cultural imposition, as it strongly contrasts with the local culture and natural surroundings[18]. Previous humanitarian projects also failed to conduct enough research on the Turkana before launching. In 1971, Norway tried to develop fishing in Turkana, which failed because fishing is not part of Turkana culture[19]. This lack of research amounts to a sort of humanitarian disdain or dominance, and is thus a legacy of imperialism.

Temporality aspects

Architectural humanitarian projects often focus on short-term benefits without providing a learning opportunity to the local community. This project was designed to benefit the students by testing their technical capabilities[9]. The lack of inclusion of locals and the disregard of local know-how highlight the imperialistic aspect of the project, demonstrating both knowledge-based and architectural imperialism.

The total budget of the project was $60,000, of which one third was used for the MIT team's travel and accommodation. This starkly contrasts with the local economy: Turkana is considered the poorest county of Kenya[20], and 46% of Kenya's population lives below the poverty line[21]. Such a budget raises ethical questions about these monetary discrepancies: how could such a structure ever be reproduced by the local community?

Sustainable urban development is increasingly important within humanitarian architecture. Renzo Piano's architectural firm, RPBW, is working on the construction of a children's surgery centre to help the NGO EMERGENCY promote healthcare in Uganda. The project will provide training grounds for the local medical workforce alongside highly qualified international staff, with responsibilities transferring fully to the local staff over time[22]. The roof will be made of photovoltaic panels, giving the clinic autonomy in its daytime energy supply. The project also challenges materialistic stereotypes: the architects have decided to use earth, widely available locally, as the main material[23], challenging the perception of it as rudimentary while using a material suited to the region.

Disciplinary tensions

The clinic in Turkana is nevertheless a praiseworthy project. Even if humanitarianism was not its main aim, the project used education to deliver aid, which, it can be argued, is better than doing nothing. The project sits at the frontier between two disciplines: educational architecture and humanitarianism. The issues that make it post-imperialistic stem from where the boundary between these two disciplines was drawn. To complete a good humanitarian project, less importance should have been given to the educational and architectural aspects, and more to local people and resources.

However, had the project done this, it might not have been a good educational one. The interdisciplinarity of this project, combining the disciplines involved in humanitarian aid (anthropology, economics, geography, etc.) with architecture, is what makes it challenging, perhaps too challenging for MIT students who are not specialised in humanitarianism.

Conclusion

The humanitarian architecture project led by the MIT students shows how a humanitarian project can reveal itself to be an imperialistic legacy. The project was undertaken for the personal benefit of the volunteer-tourist students. Local professionals were not included, which led to an improper understanding of the needs of the pastoral population of Turkana and to only short-term benefit from the clinic. Such poor consideration for the local culture and the actual humanitarian emergency perpetuates stereotypes and westernised perceptions of vulnerable and poorer regions.


  1. Lenin V I. Imperialism, the Highest Stage of Capitalism, chapter 7. 1916. Available from:
  2. Kautsky K. Ultra-imperialism. 1914. Available from:
  3. Young R. Postcolonialism: A Historical Introduction. Oxford: Blackwell; 2001.
  4. Charlesworth E. Humanitarian Architecture: 15 Stories of Architects Working After Disaster. 2014.
  5. Easterly W. The White Man's Burden: Why the West's Efforts to Aid the Rest Have Done So Much Ill and So Little Good. Oxford: Oxford University Press; 2006.
  6. Nutt S. Damned Nations: Greed, Guns, Armies, and Aid. 2011. ISBN 077105145X.
  7. Bussell H, Forbes D. Understanding the volunteer market: The what, where, who and why of volunteering. International Journal of Nonprofit and Voluntary Sector Marketing. 2002. Available from:
  8. MIT Architecture [Internet]. Jose Selgas; Spring 2014. Architecture Design Option Studio ⎯ UNmaterial. Available from:
  9. Berlanda T. Shade of Meaning: Clinic in Turkana, Kenya, by Selgas Cano, Ignacio Peydro and MIT students. The Architectural Review [Internet]. 23 February 2015. Available from:
  10. Brown S. Travelling with a Purpose: Understanding the Motives and Benefits of Volunteer Vacationers. Current Issues in Tourism. 2005;8(6):479-496. Available from:
  11. Coghlan A, Fennell D. Myth or substance: An examination of altruism as the basis of volunteer tourism. Annals of Leisure Research. 2009;12(3-4):377-402. Available from:
  12. Neumann R. Ways of Seeing Africa: Colonial Recasting of African Society and Landscape in Serengeti National Park. Ecumene. 1995;2(2):149-169. Available from:
  13. Voluntourism as a Form of Neo-Imperialism: Commodifying and Infantilizing Developing Countries to Perpetuate Global Hierarchies [Internet]. Un pas à la fois. 2013 [cited 5 December 2018]. Available from:
  14. Slim H. Humanitarian Ethics: A Guide to the Morality of Aid in War and Disaster. Oxford: Oxford University Press; 2015.
  15. Nutt S. Samantha Nutt on the Problems of Volunteer Tourism. Now This. 13 April 2018. Available from:
  16. Brainard J. Nutritional status and morbidity on an irrigation project in Turkana District, Kenya. American Journal of Human Biology. 1990. Available from:
  17. Anon. The Turkana Tribe: Nilotic people of Kenya. Kenya Information Guide.
  18. United Nations World Heritage List.
  19. The Associated Press. Examples of failed aid-funded projects in Africa. NBC News. 23 December 2007. Available from:
  20. Kajiado richest and Turkana poorest in new county ranking. Daily Nation [Internet]. 29 July 2013. Available from:
  21. Kenya at a Glance. Unicef [Internet]. 2014. Available from:
  22. EMERGENCY. Centre of Excellence for Paediatric Surgery. 2018.
  23. RPBW Architects – Renzo Piano Building Workshop [Internet]. 2018. Available from:

Soviet Imperialism in Classical Music

Introduction

Cultural imperialism is the imposition of a dominant community’s lifestyle onto another community. It involves transforming the religion, customs, language, norms, and ideology of others to reflect those of the dominant society. A prominent example is the Soviet Union, as demonstrated by the Sovietization of other cultures. One of the tools it employed to bolster its cause was the regulation of the arts. Here we argue that the Soviet government actively tried to push music to reflect its imperialist ideology, creating music which was, in Stalin's words, "socialist in content and national in form"[1](p.6).

Background

Previous scholars[1] of Soviet music have separated its progression into three phases. The first (1917–1921) was a reaction to the revolution, with musicians attempting to build a new style of music reflecting the change in their society; it consisted of experimentation designed to contrast with what had come before. During the second phase (1921–1932), Soviet music adopted a proletarian attitude and became ‘of and for the masses'[2](p.1). The final phase (1932 onwards) saw the rise of socialist realism, which incorporated the perceived positives of both earlier stages. A definitive definition of socialist realism remains contested, but it resulted in music as ‘a function of new socialist society, realistically interpreted, and related to the historic evolution of the nation as a whole’[1](p.6).

Research into the relationship between the Soviet government and music has produced mixed results. Some researchers argue that the government was uninterested in shaping Soviet music[3] and that the system was actually a meritocracy[4]. Additionally, the persistence of Western classics in the repertoire of musical institutes has been explained as instances of Soviet ideology being bent to accommodate personal taste[5]. If the application of Soviet ideology was so pliable, can the government truly be said to have played a major role in developing Soviet music?

On the other hand, due to its strong emotional resonance, music has the capacity to be a ‘mobilizing device'[4](p.104). This capability was not lost on politicians, with Lenin himself arguing that "Art functions in conjunction with the formation of social consciousness and influences the social-economic relations of society"[2](p.1). Hence arises the need to consider the evolution of Soviet music in conjunction with the regime.

What evolution did classical music undergo under Stalin's political dominance?

Complexity in music of the ruling class. (Beethoven's Sonata No. 9, Op. 47, "Kreutzer Sonata": I, performed by Paul Rosenthal and Edward Auer)
Folk song for the masses. (Volga Boatmen's Song performed by Feodor Chaliapin (1873–1938) with orchestra accompaniment)

Firstly, the communists promoted a new musical style, socialist realism, which was more accessible to the masses. It was reported that workers were bored during Beethoven concerts[1]. According to the manifesto of the Russian Association of Proletarian Musicians, the bourgeoisie's musical culture was too developed for the workers[1], who did not understand it; they needed to feel emotion to be engaged. Thus Soviet music avoided counterpoint, and its harmony was mostly homophonic[1]. Composers were asked to use folk themes, and Stalin later promoted very melodic and solemn music, as it was easier to follow. Shostakovich’s second opera, Lady Macbeth of Mtsensk, was accordingly banned in 1936 after Stalin saw it: its style was deemed too dislocated[6].

Rage in Shostakovich's 5th Symphony (Scene from the documentary Nelsons No. 5)

The party therefore needed to control and motivate performers and composers, even though musical codes cannot be fully controlled. In 1932, the Union of Composers was created, allowing composers to be controlled ideologically[7]. The communists forbade performances that were not socialist realist, and the main communist newspaper, Pravda, often published scathing articles against composers considered anti-socialist. For example, two days after Stalin saw Lady Macbeth of Mtsensk, a ruthless article against Shostakovich was published in Pravda[6]. Furthermore, the party used terror to force composers to compose 'correctly'. Shostakovich feared deportation so much that he would stand for hours in front of his door to hide his fear from his family[8], and a political instructor was assigned to him[8]. He called his 5th Symphony 'the answer of a composer to just criticism'. While the symphony was read as an ode to the party, musical ciphers allowed Shostakovich to encode a parody of what was asked of him: the final movement is a parody of apotheosis, full of rage and hatred[8][7]. Soviet composers nevertheless enjoyed the highest social status under the regime. They received material privileges such as housing, and were paid at least 10,000 rubles for symphonic works[1]. Composers were given incentives to carry on composing through a system of government rewards: in 1941 Shostakovich received the Stalin Prize of 100,000 rubles for his Piano Quintet[1]. These rewards were powerful incentives, as composers had starved during the revolution and would compose in exchange for food.

Furthermore, certain compositions were excluded from the performed repertoire because they were not in line with Soviet ideology (e.g. Tchaikovsky’s 1812 Overture, or Stravinsky’s late compositions in the religious or neo-classical style). Likewise, older composers were required to undergo an ideological change in order to remain relevant to the emerging style of Soviet music. Examples include Sergei Vasilenko, with his Red Army Rhapsody, and Maximilian Steinberg, who dedicated an entire symphony (Turksib) to the development of the railroad linking Siberia and Turkestan[1]. Newer works incorporated political connotations, hinting at the supremacy of the USSR. Prokofiev never lost contact with Russia, though he spent 15 of his most productive years abroad. One of his most successful works is Peter and the Wolf, the “symphonic fairy tale”, which can be interpreted in a political key (the wolf as Adolf Hitler). The outbreak of the Second World War led composers to write works linked to events: Shaporin wrote the cantata Tale of the Battle for the Russian Land, split into 12 sections illustrating the war (e.g. (5) Song of a Red Army Man); Prokofiev wrote the cantata Ballad of the Unknown Boy, glorifying guerrilla activity on Russian territory, and completed an opera based on Tolstoy's War and Peace, alluding to the war against the Germans[1][9]. Because of the new melodic style of Soviet music, operas and ballets naturally became more sought after than symphonies. The transition was tedious and marked by many failures: iconic examples include Deshevov's Ice and Steel, which deals with the Kronstadt revolt of 1921, and Gladkovski’s For Red Petrograd, whose libretto centred on the White Army campaign of 1919. The first real success was Dzerzhinsky’s Quiet Flows the Don, which attracted full houses and received an important endorsement[9].
Dzerzhinsky said: “Comrade Stalin said that the time was ripe for the creation of a classical Soviet opera […] Such an opera should make use of all the latest acquisitions of musical technique, but it should above all strive to maintain closeness to the masses, clarity, and accessibility”[1](p.10), signifying adherence to Soviet ideology. Several classical operas were reworked and presented with new librettos (e.g. Glinka's A Life for the Tsar was restaged as Ivan Susanin[9]).

Conclusion

In conclusion, the imperialist regime manifested itself in all disciplines, including the arts. Even though music is boundless, the communists clearly showed that it can be steered in a chosen direction. While it cannot be said that the regime's actions were righteous, one thing is certain: Soviet music thrived. Soviet musicians dominated competitions, and composers had their works performed worldwide[10]. The new music showcased a modern, melodic style and was socialist in its themes.

References

Imperialism in the 'War on Drugs'

Introduction

Imperialism, broadly defined, is the extension of power by one state or institution over another, typically involving unequal power relations in cultural, economic or political exchange. Historically, the term evolved into its current use in the context of colonialism, particularly that of the British Empire, although Edward Said makes a distinction between imperialism and colonialism.[11]

This chapter seeks to explore the Bolivian coca eradication campaign from the paradigm of imperialism. It argues that the funding and execution of coca eradication by the United States government is a form of modern imperialism that implements American policies to solve an American problem at the expense of and without consultation with local stakeholders.

Historical context

History of Coca Cultivation in Bolivia

Erythroxylum coca, a tropical shrub used in the production of cocaine, has been cultivated in the Andes for centuries and has significant cultural and economic value to the region. As one of the first trade goods in the Andes, it dulls pain, hunger and fatigue, facilitates digestion, provides vitamins and minerals, and is used in medicine and rituals.[12][13] The leaves also have a traditional significance in social exchanges, gathering, and marriage.[14]

The upsurge in U.S. demand for cocaine, and hence coca, in the 1980s came during a period of economic crisis in Bolivia amidst a political transition. Coca became Bolivia's most viable commodity, and the coca-cocaine economy became an essential economic stabiliser that boosted national reserves and inward investment, providing security to miners and farmers displaced from contracting economic sectors[15]: the population of Chapare, which became a major coca-growing region in response to rising demand, increased from 40,000 in 1980 to 215,000 in 1987, while coca acreage grew from 16,370 to 51,798 hectares.[16]

The American 'War on Drugs'

The term 'War on Drugs' refers to the United States' policy on banned narcotics, which was intended to “stop illegal drug use, distribution and trade by increasing and enforcing penalties for offenders”[17], and was popularised in 1971 by President Richard Nixon.[18]

President Nixon’s speech preceded the Anti-Drug Abuse Act of 1986, which was catalysed by the appearance of crack cocaine, a highly addictive form of cocaine whose use became widespread in 1984 and 1985.[19] Cocaine-related hospital emergencies increased by 12% in 1985, and by 110% in 1986.[20] The first Act was followed by a second in 1988, which laid out stricter regulations on crack in particular.

In this period, American foreign policy shifted to source control, on the reasoning that a sufficiently large cut in supply would undercut the market. Large amounts of monetary aid were consequently channelled to the Bolivian military for eradication efforts.[21]

Coca eradication as American imperialism in Bolivia

US leverage in Bolivia

Clashing perspectives on the drug trade

Part of the conflict arose from divergent perspectives on the growing drug trade: the U.S. pushed for a militarised solution in response to growing crack-related violence but was met with relative ambivalence. Bolivia lacked widespread drug violence, and the issue was not viewed as one of national security; there was greater concern at the time with the political transition.[15]

There was emphatic local resistance to the eradication, and slogans such as ‘coca for development’ and ‘coca is not cocaine’ emphasised coca’s importance for rural communities.[15] The U.S., on the other hand, were frustrated with Bolivians for harbouring “denial regarding virtually any other aspect of the drug problem in Bolivia other than coca cultivation”.[22]

Political leverage via economic aid

Yet, given its weakened economic state, Bolivia was dependent on U.S. support for recovery, and was influenced by fears of being labelled a ‘narco-state’ and by international pressure to comply with counterdrug efforts.[15] In 1983, Bolivia accepted $53 million in U.S. development aid, conditioned on achieving eradication targets of 500 hectares a year[23] and ‘satisfactory’ cooperation with antidrug efforts.[24] Attempts to appease U.S. concerns resulted in the approval of U.S. military involvement in counterdrug operations such as Operation Blast Furnace in 1986, and the passing of a new drug control law in 1988. With U.S. economic assistance tied to counterdrug cooperation, Bolivia could not afford the sanctions entailed by the alternative.[15]

A similar power relation played out with the 1989 Andean Initiative, which granted Bolivia, Colombia and Peru additional funding conditioned on the acceptance of militarised efforts and renewed eradication targets, and which continued to favour enforcement over alternative agricultural and economic development.[15]

Facing resistance to the involvement of the Bolivian military, then-U.S. ambassador Robert Gelbard used his influence with former Cold War allies to push the strategy, asserting that President Paz Zamora’s ‘mismanagement’ and ‘lack of clear leadership and decision-making ability’ jeopardised U.S. economic assistance. The agreement was eventually signed under U.S. embassy pressure and in exchange for the release of development aid. The U.S. then began to bypass government officials it presumed corrupt when planning operations, and looked to remove these individuals. Notably, the U.S. succeeded in securing the appointment of a key antidrug task force leader in its favour by threatening to withdraw aid,[15] demonstrating its ability to exert influence over the Bolivian government.

Consequences

Success, therefore, came at the expense of the poorest stakeholders in the production chain, leading to social upheaval, falling property markets and widespread unemployment. Attempts to promote alternative crops, including a $32 million United Nations programme, largely failed. Coca, which accounted for over 90% of agricultural income in the Chapare, offered higher returns than any other crop: it grew extremely well, was pest resistant and produced multiple annual harvests. Furthermore, it was easily transported and did not perish as quickly as crops like banana and pineapple. Those who did turn to alternative crops often found insufficient market demand and ended up in debt.[25]

Furthermore, the heavy military presence made the population vulnerable to abuses of power. Multiple human rights watchdogs reported U.S.-funded soldiers stealing goods, burning buildings, and torturing coca growers.[26]

Ineffectiveness at curtailing the cocaine trade

Source eradication is a largely inefficient policy tool, generating only $0.15 in benefits per dollar invested, as opposed to $7.50 per dollar for treatment and education policies.[27] The consequences of the coca eradication campaign have greatly outweighed any benefits, and the basic efficacy of the campaign is questionable, since the suppression of one coca producer only results in another filling the vacuum. Despite significant reductions in coca cultivation in Bolivia and Peru, eradication had essentially no effect on supply, as Colombia took over as the largest producer.[25] Cocaine consumption in 2001 was almost the same as it was in 1997.[28]

There was also virtually no impact on the price of cocaine, largely because the cost of coca leaves comprises less than 0.5% of the street price.[25] The bulk of the cost comes from the other chemicals used in the processing, supplied largely by North American and European companies.[15]

Moreover, given the lack of viable alternatives, growers are likely to begin large-scale replanting, particularly if a similar suppression programme is successful in Colombia.[25]

Conclusion

The U.S. coca eradication campaign in Bolivia was an imperialistic campaign insofar as it exerted influence over Bolivia's economy and politics from a dominant geopolitical standpoint, utilising economic aid as leverage. Furthermore, the campaign was largely ineffective in addressing the cocaine trade, and was carried out at the expense of a dependent state.

References

Imperialism in Museums

Introduction

Imperialism broadly denotes an expansion of a state’s power involving territorial, political, or economic control over other areas or peoples. Contemporary understandings of imperialism refer primarily to the dominance exerted overseas by European empires in the 19th and 20th centuries.[9] Most generally, imperialism is associated with a power imbalance between two states.[29] Doyle characterises imperialism as a relationship between a dominating metropolitan centre and the peripheral territory it controls.[30] Profound inequality between metropole and periphery is implicit in this definition, which has been employed by scholars such as Edward Said.[31] Furthermore, imperialism can exist without direct rule over foreign territories; control can instead be exercised in political, economic, or social spheres.[9] Notably, Said outlines how imperialism can be generated through culture and knowledge[32] by conceptualising the colonised population as an inferior ‘other’, thus reinforcing a dichotomy of identities.[33]

The imperialist legacy remains present in museums around the world and is thus an important issue to address. Art historians such as Alice Procter, who established the 'Uncomfortable Art Tours', aim to challenge how information is displayed in museums and to highlight their imperialistic nature, demonstrating the contemporary relevance of this issue.[34] Accordingly, we will observe the way in which museums are still used to create divisions in people's minds. This subconscious division, as conceptualised by John Willinsky, is between different cultures, based on race, religion or gender.[35] This claim is supported by Galtung's findings regarding the psychological effect of imperialism on human behaviour. His study suggests that imperialism produces differing basic psyches in students from countries that benefit from imperialism and in students from negatively impacted countries: the former are more inclined to be autonomous, whilst the latter are more dependent.[36] It is this division that maintains the strength of imperialist ideologies within a society, with one group believing it is superior to another and so should dominate. We will argue that museums, through architecture, categorisation, displays and appropriation, are one tool used to enforce these psychological divisions.

Case Studies

To illustrate the existing imperialist nature of museums, we will examine three institutions and the way in which their particular features create the divisions within people’s minds that Willinsky describes.

The Great Exhibition

The Great Exhibition: Crystal Palace
Moving Machinery

As a past example, the Great Exhibition, a Victorian international exhibition of culture and industry, was used to display the technological, political, and intellectual project of Western imperialist powers and, in doing so, 'insisted on the perfectibility of all peoples (under European guidance)'.[37] The culture chosen for display to the world was one of progress and innovation, rather than any part of British culture that might tarnish Britain's reputation. This shows how institutions were utilised to assert the superiority of Western, European or even 'white' culture over the cultures imperialists desired to control. The superiority of Western culture on display is one way in which divisions were created in people's minds between the 'superior' intellect of West versus East, or civilised peoples versus savages.

Imperialism in the Australian War Memorial

The Australian War Memorial

The exclusion of the frontier wars from the Australian War Memorial supports Willinsky’s argument, as it creates a psychological division implying the inferiority of the Australian aborigines to the British colonialists. The memorial was opened in 1941 and aims to commemorate the country's soldiers who died in the wars that make up its history. The museum focuses primarily on Anzac history and conveniently excludes the frontier battles between the Australian aborigines and colonial police, soldiers and settlers, which serve as a historical example of colonialist racial violence.[4] Additionally, by 2018 the Australian government had spent over $600 million on a four-year commemoration programme known as Anzac 100,[38] which makes its failure to remember the aborigines even more severe.

As it is estimated that approximately 20,000 indigenous Australians died between 1788 and 1928 as part of the frontier conflicts,[4] one must question the reasons behind the museum's disinterest in these events. As argued by historian Michael McKernan, the museum's exclusion of the battles suggests that "the particular part of the story is too confronting or too uncomfortable" to be mentioned,[4] creating even greater controversy around the subject. This omission illustrates the presence of imperialism in the museum, as it disregards the racist nature of the British colonialists. Moreover, the exclusion of the battles conceals the oppressive, violent character of British settlement in Australia. It thus indirectly portrays the settlers as superior, as they are all but absolved of their crimes by the fact that they are not mentioned. This has the effect of validating their actions and thus creating a division between the colonists and the aborigines.

Imperialism in the British Museum

The British Museum

The British Museum emerged as a direct consequence of politics, as successful colonial expeditions provided ‘exotic’ objects to be displayed as indicative of the Empire’s power.[1] Therefore incapable of being neutral, this exhibition space is intrinsically laden with imperialist implications which (whether purposefully or inadvertently) recontextualise the objects.[2]

Elgin Marbles: Classical Greek marble sculptures

In recent years there has been an increase in requests for the repatriation of artefacts made by former colonies, which at the time were unable to withstand “the original removal of historical objects”.[1] In contrast to other institutions, the British Museum has remained obstinate in its response to such requests, arguing that the collection, in its current constellation and location, permits maximum benefit for the greatest number of people. In this way the museum maintains its position as an “appropriate custodian”[1] of these objects, thereby implying that the source nations are incapable of housing their own artefacts and dependent on Britain. As the political relationship between these states and Western powers has changed, however, the refusal to return artefacts perpetuates feelings of suppression, serving as a constant reminder of their removal.[1]

To this extent, denying repatriation reinforces the psychological divide in visitors described by Willinsky, as it prohibits interpretations by the source nations of their own artefacts, instead imposing an imperial, Eurocentric lens on much of their cultural heritage.[1] This is further demonstrated by the general movement of objects from former colonies to Britain, the “civilised centre”, defining the former as peripheral and thus inferior to the latter.[2] As these objects are integral to the cultivation of national identity and pride, it follows that separation from them can be detrimental to a nation’s self-worth.

Conclusion

The issue of imperialism in museums naturally incorporates the disciplines of politics, art history and anthropology. In order to better understand the extent to which imperialism is still present in contemporary museums, we considered the psychological effects of imperialism. These supported Willinsky’s point that imperialism creates divisions within people’s minds. We explored these divisions both through past examples and in the contemporary world. It was found that in both old and new museums, whether intentionally or not, a dichotomy between Western and ‘other’ cultures remains present. It seems important, then, that, like Alice Procter, we do not consider imperialism as something of the past but continue to challenge the divisive narrative it presents us with.

References

  1. Slonimsky, N. Soviet Music and Musicians. The Slavonic Review, American series. 1944; 3(4): 1-18.
  2. Kozlenko, W. Soviet Music and Musicians. The Musical Quarterly. 1937; 23(3): 295-304.
  3. Mikkonen, S. Music and Power in the Soviet 1930s: A History of Composers' Bureaucracy. Lewiston: Edwin Mellen Press; 2009.
  4. Nelson, A. Music for the Revolution: Musicians and Power in Early Soviet Russia. University Park: Pennsylvania State University Press; 2004.
  5. Fairclough, P. Classics for the Masses: Shaping Soviet Musical Identity under Lenin and Stalin. New Haven: Yale University Press; 2016.
  6. Ashley, T. The Guardian. Too Scary for Stalin; 26/03/2004. Available from
  7. Il y a 100 ans la Révolution, France Culture (radio), 1917, ce que la Révolution a fait à la musique russe, 21/10/2017, available from
  8. Une vie une oeuvre, France Culture (radio), Chostakovich – Celui qui a des oreilles entendra, 21/10/2017, available from
  9. Frolova-Walker, M. The Soviet Opera Project: Ivan Dzerzhinsky vs. Ivan Susanin. Cambridge Opera Journal. 2006; 18(2): 181-216.
  10. Tomoff, K. Virtuosi Abroad: Soviet Music and Imperial Competition during the Early Cold War, 1945–1958. Cornell University Press; 2015.
  11. Gilmartin, M. (2009). Colonialism/imperialism. In C. Gallaher, C. T. Dahlman & M. Gilmartin, Key Concepts in Human Geography: Key concepts in political geography (pp. 115-123). London: SAGE Publications Ltd doi: 10.4135/9781446279496.n13
  12. Allan C. The Hold Life Has: Coca and Cultural Identity in an Andean Community. Smithsonian Institution Press; 1988.
  13. Baulenas A. Coca: A Blessing and a Curse. National Geographic History. 2016.
  14. The coca leaf in Andean societies – Bolivia [Internet]. 2018 [cited 9th December 2018]. Available from:
  15. Gillies A. (2018) The Coca-Cocaine Economy, the US ‘War on Drugs’ and Bolivia’s Democratic Transition (1982–1993). Andean Information Network. Available from:‘War-on-Drugs’-and-Bolivia’s-Democratic-Transition-1982-1993.pdf
  16. Painter J. Bolivia and Coca: A Study in Dependency. Boulder, Colorado: Lynne Rienner; 1994.
  17. Editors. War on Drugs. 2017. Available from: [Accessed 8th December 2018].
  18. Richard Nixon. Special Message to the Congress on Drug Abuse Prevention and Control. 1971.
  19. Reinarman C, Levine H. Crack in America: Demon Drugs and Social Justice. Berkeley: University of California Press; 1997.
  20. DEA History Book, 1985–1990 [Internet]. 2018 [cited 10 December 2018]. Available from:
  21. Patton J. Counterdevelopment and the Bolivian Coca War. The Fletcher Journal of Development Studies. 2002;42. Available from:
  22. Continued Bolivian Waffling on Counternarcotics Assistance to the Army. National Security Archive – The George Washington University. Washington DC, 1990.
  23. Malamud-Goti J. Smoke and Mirrors: The Paradox of the Drug Wars. Boulder (Colo.): Westview Press; 1992.
  24. United States General Accounting Office. Drug Control: US-Supported Efforts in Colombia and Bolivia, Report to Congress. Washington DC; 1998.
  25. Kohl, B. & Farthing, L. (2007). The Price of Success: Bolivia’s War Against Drugs and the Poor. NACLA. [online] Available at:
  26. Human Rights Watch/Americas. Bolivia under pressure: Human Rights Violations and Coca Eradication. Human Rights Watch; 1996.
  27. Kennedy, M., Reuter, P. & Riley, K.J. A Simple Economic Model of Cocaine Production. Santa Monica; Rand Corporation: 1994.
  28. Jordan, S. (2001). Bolivians rebel over ban on ‘sacred’ coca. The Guardian. [online] Available at:
  29. Howe S. Empire: A Very Short Introduction. Oxford University Press; 2002.
  30. Doyle MW. Empires. Ithaca, NY: Cornell University Press; 1986.
  31. Said E. Culture and Imperialism [Internet]. New York: Vintage Books; 1994 [cited 2018 Nov 27]. p. 9. Available from:
  32. Duara P. Modern Imperialism. In: Bentley JH, editor. The Oxford Handbook of World History [Internet]. Oxford University Press; 2011 [cited 2018 Nov 26]. p. 8. Available from:
  33. Said E. Orientalism [Internet]. New York: Vintage Books; 1979 [cited 2018 Dec 3]. p. 39–46. Available from:
  34. Procter A. Museums are hiding their imperial pasts- which is why my tours are needed. The Guardian. 23 Apr 2018 [cited: 29 Nov 2018]. Available from:
  35. Willinsky, John. Learning to Divide the World. University of Minnesota Press; 1998.
  36. Eckhardt W. and Young C. Psychology of Imperialism. Peace Research. January 1975. Volume 7 42-44. Available from:
  37. Buchli, Victor. The Material Culture Reader. Oxford: Berg Publishers; 2001.
  38. Daley P. Australia's frontier war killings still conveniently escape official memory. The Guardian. 8 Jun 2018 [cited: 29 Nov 2018]. Available from:

Imperialism in Global Health

Global health – a “collaborative trans-national research and action for promoting health for all”[1] – is an inherently interdisciplinary issue, despite stemming primarily from medicine. The varying definitions imposed by the disciplines involved, such as economics, politics and anthropology, create conflict in the process of finding a solution and reaching a desired outcome. Economics and politics would dictate that funding be directed to the most damaging health issues, but the question remains whether quantitative or qualitative data is more useful in analysing these issues and deciding which are the most damaging. Anthropological studies are necessary where culture actively undermines scientific insight into medical problems through religious ideology or tradition, preventing accurate data collection. Indeed, mental health is just one example of a complex issue whose treatment and perception differ significantly across cultures. Imperialism, particularly Eurocentrism, therefore affects the research underlying global health, especially in how data is collected and presented, and leads to Western definitions of health being imposed on the rest of the world.

Imperialism, having a pervasive influence over several aspects of modern culture, can be defined in several ways. Kushar notes the underlying “principle of universality” inherent in empires, suggesting that the role of an empire is to impose a singular concept of civilisation, and thus defines imperialism as designating “a rule over a large space and many peoples”.[2] This, however, seems to limit imperialism to physical empire. Galtung supports this to some extent, arguing that imperialism exists “particularly between the nations”, but he also takes the broader view that imperialism leads to two main issues: inequality and resistance to change.[3] Nuzzo, furthermore, argues that processes of “de-westernization, decolonization and re-westernization” call for a rethinking of how imperialist legacies are viewed today.[4] This broader perspective can be used to examine how imperialism, particularly Eurocentrism – the lingering influence of Europe's presumed “cultural and moral superiority” at the height of its power,[5] according to Heraclides and Dialla – remains deeply rooted in the study of global health.

There is a Western focus in global health in terms of how data on global diseases are collected and presented. A major player in global health is the World Health Organization (WHO), and a glance at the WHO's webpage reinforces this idea. In the “Top 10 Global Causes of Death”,[6] the breakdown of causes of death globally is almost identical to the breakdown for high-income countries. In contrast, diseases that plague low-income countries, such as diarrhoeal diseases, rank at an insignificant 9th in the global breakdown. Furthermore, other leading causes of disease worldwide – also found to be most prevalent in high-income OECD countries – are mental disorders and substance abuse.[7] An economist may see no issue with the data, but from an anthropological point of view the imposition of supposedly global health onto non-Western countries is worrying. This imposition seems to be a product of domineering countries defining global health, but it is still important to note the particular challenges faced when treating mental health in less-developed countries. Uganda's Health Services Strategic Plan (HSSP) states that there are many challenges to the mental health system in Uganda. Close to a third of the population live more than five kilometres from the nearest health facility, and the public transport system is poor and mostly unaffordable to those who need it. Furthermore, there are 43 different languages in Uganda, and despite their similarities, interpreters recognise that there may be significant differences in values and beliefs. For example, individuals find it difficult to share experiences with caregivers about sex, violence and traumatic situations associated with mental health even when linguistic and cultural diversity is bridged.[8] These challenges may directly affect data collection for mental health.
As such, the problem of Eurocentric data dominance in global health may not be due solely to academic imperialism; other underlying factors in less-developed countries may play a part as well. However, there is still an undeniable Western slant in global health. Whilst conditions like diarrhoea and infectious diseases cause fewer deaths overall, they disproportionately ravage low-income countries, and simply listing them among global causes of death not only obscures this fact but also inhibits the institution of a more inclusive global health system.

In addition to the use of primarily Eurocentric data, Derek Summerfield argues that another problem with “global mental health” is the presumption that Western notions and definitions of mental health can be translated into all countries.[9] The biomedical models behind mental health rely heavily on Western social and economic conditions; this context may not translate to the non-Western world, with its vastly different philosophies and standards of living. This is illustrated by a cross-cultural study[10] of mental health beliefs and attitudes, in which Islamic ideology was found to stray from the Western diagnosis of mental distress, attributing it to supernatural causes instead. This anthropological view highlights the role culture and religion play in both the diagnosis and treatment of mental health, negating the assumption that Western health models can simply be applied to every society. The impact of this Eurocentric lens on global health is reflected in the actions of governments, non-governmental organisations (NGOs) and companies. The United Kingdom (UK) alone saw an increase of around US$2.5 billion in funding for mental health research and services[11] in 2018, compared to the total of US$3.1 billion available for malaria worldwide.[12] This disparity shows the ever-present influence of imperialism in today's world. It is important to consider that, whilst there is significantly more funding for mental health issues in one country than for the global outbreak of malaria, it is unfair to label this simply as imperialism. From a political point of view, it is not imperialistic for a country to prioritise its own perceived health issues. Similarly to the UK, South Africa spends mainly on the ailments that plague it – the budget for HIV/AIDS and tuberculosis makes up the second-largest proportion of its healthcare budget.[13] Despite this, it is still apparent that Eurocentrism comes into play in global health.
The recent 2018–19 WHO Programme Budget,[14] which proposes a US$5.7 million decrease in the malaria budget and a US$1 million increase in the mental health and substance abuse budget relative to the approved 2016–17 budget, supports this from a funding perspective.

To conclude, Eurocentrism is undoubtedly commonplace in the research and funding that underpin global health, leading to conflicts between disciplines. As demonstrated, the WHO presents mental health and substance abuse as worldwide phenomena even though they are primarily problems of Western countries. This presentation of data leads to Western definitions of health being imposed on the rest of the world, disregarding non-Western culture and religion. Eurocentrism also sparks debate between the disciplines involved in global health: political and economic perspectives contest the evidence of imperialism, whereas anthropological studies acknowledge the presence of Western domination in global health, offering a different understanding of health in non-Western societies. Hence, Eurocentrism remains a pressing issue in the perception of global health.

References

  1. Beaglehole, R. and Bonita, R. (2010). What is global health?. Global Health Action, 3(1), p.5142. Available from: [Accessed 25 November 2018]
  2. Kushar, K. (2011).  The Oxford Handbook of the History of Political Philosophy. Available from: [Accessed 28 November 2018]
  3. Galtung, John. (1971). A Structural Theory of Imperialism. Available from: [Accessed 28 November 2018]
  4. Nuzzo, Luigi. (2018). Rethinking Eurocentrism: European legal legacy and Western colonialism. Available from: [Accessed 29 November 2018]
  5. Dialla, A. and Heraclides, A. (2015). Eurocentrism, ‘civilization’ and the ‘barbarians’. Available from: [Accessed 28 November 2018]
  6. World Health Organization. (2018). The top 10 causes of death. Available from: [Accessed 26 November 2018]
  7. WHO. (2018). Chapter 7: Mental Health and Substance Abuse.  Available from: [Accessed 25 November 2018]
  8. Kopinak, J. (2014). Mental Health in Developing Countries: Challenges and Opportunities in Introducing Western Mental Health System in Uganda. International Journal of MCH and AIDS (IJMA), 3(1). Available from: [Accessed 27 November 2018]
  9. Summerfield, D. (2013). "Global mental health" is an oxymoron and medical imperialism. BMJ, 346(may31 2), pp.f3509-f3509. Available from: [Accessed 28 November 2018]
  10. Sheikh, S. and Furnham, A. (2000). A cross-cultural study of mental health beliefs and attitudes towards seeking professional help. Social Psychiatry and Psychiatric Epidemiology, 35(7), pp.326-334. Available from: [Accessed 29 November 2018]
  11. Gilburt, H. (2018). Mental health funding in the 2018 Autumn Budget.The King's Fund. Available from: [Accessed 29 November 2018]
  12. World Health Organization. (2018). Malaria. Available from: [Accessed 29 November 2018]
  13. UNICEF. (2018). Health Budget South Africa 2017–2018. Available from: [Accessed 29 November 2018]
  14. World Health Organization. (2018). Programme Budget 2018–2019. Available from: [Accessed 1 December 2018]