The Beauty Algorithm: The New Tool of Manipulation
Technology is taking us to places where we don't want to go.
-Sherry Turkle, founding director of the MIT Initiative on Technology and Self 
We are living in the technology era: everything has been, or is being, reshaped by technology. Now technological evolution is attempting to tell us how we should look. Beauty algorithms, such as those created by the technology company Megvii and the "facial aesthetics consultancy" Qoves, have already begun to judge our looks based on parameters such as "symmetry, facial blemishes, wrinkles, estimated age and age appearance, and comparisons to actors and models." Meanwhile, to get more likes and followers on social media platforms, we feel we have to use filters to "better" the way we look.
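To make concrete what "judging looks based on parameters" can mean, here is a deliberately simplified sketch of a facial-scoring pipeline. The feature names, weights, and scale are all invented for illustration; this is not Megvii's or Qoves's actual method, only the general shape of a weighted-feature scorer.

```python
# Hypothetical sketch: combining normalized facial measurements into one score.
# All feature names and weights are invented for illustration.

WEIGHTS = {
    "symmetry": 0.4,       # 0.0 (asymmetric) .. 1.0 (perfectly symmetric)
    "blemish_free": 0.25,  # 1.0 means no detected blemishes
    "smoothness": 0.2,     # inverse of detected wrinkles
    "age_match": 0.15,     # how closely estimated age matches a "target" range
}

def beauty_score(features: dict) -> float:
    """Weighted sum of normalized measurements, scaled to 0-100."""
    raw = sum(WEIGHTS[name] * features.get(name, 0.0) for name in WEIGHTS)
    return round(100 * raw, 1)

face = {"symmetry": 0.9, "blemish_free": 0.8, "smoothness": 0.7, "age_match": 1.0}
print(beauty_score(face))  # a single number standing in for a human face
```

The point of the sketch is how reductive the design is: whatever a culture actually values, the output is one number, and the weights quietly encode someone's idea of what matters.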
The problem with filters
That constant use of filters is especially harmful because it makes people dependent on filtering; consequently, they don't feel beautiful without it. As Dr. Amy Slater, deputy director of the Centre for Appearance Research at the University of the West of England, explains, "[t]he concern is that consistent exposure to a perfect image will leave people feeling like they don't measure up." Filtering one's pictures every day, or seeing perfected images of others, makes people insecure and less comfortable with their natural appearance. They feel a constant need to retouch their pictures and videos, and at some point want to look permanently "beautiful," which in some cases has led to unnecessary cosmetic surgeries that have sometimes ended in death. Take the case of Solange Magnano, a former Miss Argentina who died in 2009 after a "complication arising from plastic surgery," as reported by CNN's Marc Tutton.
The problem with beauty algorithms
The use of beauty algorithms by social media sites is intentional. Beauty algorithms were made to make us insecure about ourselves so that we will need assistance to "perfect" ourselves. As Naveen Joshi explains in "What AI is doing in the beauty and cosmetic industry," beauty algorithms help companies increase their sales and profits. Beauty algorithms offer an "illusion of perfection" that hides what digital director, writer, and creative consultant Brooke McCord calls a "disturbing truth." She explains that the beauty algorithm tries to convince us it can improve our looks and lives, when in reality it is being used to manipulate us. Facebook and other social media platforms already do this: they sell us the dream of entertainment so they can monitor us while we are connected to their platforms, gather all the information they need, and then sell our data to third parties without our knowledge, making money on our backs. That is exactly what Jeff Orlowski showed in his 2020 documentary The Social Dilemma.
Beauty algorithms are affecting us physically, emotionally, and mentally. We are more stressed about how we look and desperately try to resemble the "perfect" faces and sculpted bodies social media constantly shows us. In the podcast episode "The AI of the Beholder," Jennifer Strong explains that computers are now ranking the way people look, and "the results are influencing the things we do, the posts we see, and the way we think." Ideas about what constitutes "beauty" are complex, subjective, and by no means limited to physical appearance; elusive though it is, everyone wants more of it. That means big business and, increasingly, people harnessing algorithms to create their ideal selves in the digital and, sometimes, physical worlds. Throughout the episode, Strong and everyone she interviews agree on the same fact: beauty is a vast industry ready to profit from consumers at any cost.
Behind the design
Beauty algorithms, and technological inventions in general, are designed by a small group of people, mostly white or Asian, who often know little about other societies' cultures. They rely, however innocently, only on their own culture, and their judgments and points of view show up in their inventions. That helps explain why Lauren Rhue, Ph.D., assistant professor of information systems at the Robert H. Smith School of Business in College Park, found that facial recognition programs exhibit "two distinct types of bias": stereotyping and racial bias. She demonstrated how facial recognition software failed to give accurate, objective results for each of the 400 NBA player photos from the 2016-2017 season she used in her experiment. Rhue believes emotion-reading technology reflects stereotypes and prejudices, since it scored Black faces as angrier than white faces even when both had the same smile. She also attested that facial recognition programs, as well as beauty algorithms, are coded in favor of one group (whites) and exclude others, especially Black people.
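Rhue's core finding, that identical expressions can be scored differently across groups, can be illustrated with a deliberately simplified sketch. Suppose a model, trained on skewed data, learned a spurious dependence on image brightness alongside the genuine smile signal; the same smile then reads as "angrier" on a darker photo. Every number and name here is invented for illustration, not taken from any real emotion-recognition system.

```python
# Toy illustration (invented numbers) of a biased emotion scorer:
# "anger" falls with smile intensity, but the model has also learned
# a spurious negative weight on image brightness.

def anger_score(smile_intensity: float, brightness: float) -> float:
    """Hypothetical biased scorer; both inputs are in the range 0.0-1.0."""
    return max(0.0, 1.0 - 0.8 * smile_intensity - 0.3 * brightness)

same_smile = 0.5
score_light = anger_score(same_smile, brightness=0.9)  # brighter photo
score_dark = anger_score(same_smile, brightness=0.4)   # darker photo

# Identical smiles, yet the darker photo receives the higher "anger" score.
print(score_light, score_dark)
```

The bias here is not a malicious line of code; it is a correlation the model absorbed from whoever assembled its training data, which is exactly why homogeneous design teams produce it without noticing.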
Why would a program designed for everyone focus on only one group of people and ignore others' reality during the coding process? The best way to destabilize and dominate a group of people is to make them doubt who they really are. Through beauty algorithms, racism is being implanted into our everyday lives. We are so confused about ourselves that what we used to call "beautiful" is progressively becoming "ugly" to us. I remember, growing up in West Africa, that plus-size women were considered the most "beautiful" by most people. To them, being plus-size showed how well fed and healthy a person was. It was not always kind to skinnier people, but that is how it was. Years later, when TV began broadcasting more European movies promoting skinny as sexy, the ideal of beauty started to change.
Men started to prefer skinnier women, and the plus-size beauties became "unwanted." I also remember, in my younger years, hearing people say, "Wow! This person is so dark! She is so black, her skin is so beautiful!" As the years went on and the cosmetic industry promoted skin-bleaching products, being black became a source of shame, as if dark skin meant poor hygiene. Pears' soap advertisements built on the "washing the blackamoor white" trope, for example, suggested that black skin was dirt to be washed off in order to become clean and white. From this, we can say imperialism never really stopped; its heirs merely made it subtle. As old Africans like to say, "The dog will never change the way he sits." The saddest part is that the younger generation now thinks it is not normal to be natural. Most of them know little about imperialism, racism, or slavery, which makes it easier for imperialists hiding behind technology and progress to manipulate them. I have a sad memory of a young lady in my high school whom boys used to bully by calling her "midnight" just because her skin was very dark.
The positive side
Opening up to the world means opening up to diversity, and I believe diversity is the most beautiful and essential part of human history. It is not always easy to be fair or to accept everyone's values and ways of living, but as a globalized society we must embrace each other's differences to make the world a better place to live in. Beauty algorithms, filters, and cosmetic surgery do have a very negative impact on modern society, causing self-consciousness, worry, and pessimistic views of our bodies. However, they also have agreeable aspects in our everyday lives. Filters can make our interactions on social media more interesting and relaxing, contributing to our well-being. Filters like Sun Baby, Shrek, Disney Pixar Face, and Moth Everywhere were first introduced for fun, and they are still used that way on platforms like Instagram, TikTok, and Snapchat. These filters are so funny and entertaining that Lauren Webber, popularly known as LaurenzSide, an American online gamer, commentator, YouTuber, TikToker, and social media star, decided to try them in videos for her followers.
In her video "Trying the WEIRDEST Instagram filters ever made," she tried some of these filters on herself. She was happy and relaxed during the process, and honestly, I enjoyed watching the videos too. We face so many grave topics (the coronavirus, cancer, mass shootings, climate change and all the disasters that come with it) that we need amusing things like emojis and filters to help us relax. Beauty algorithms and plastic surgery could also be used to give people their smiles back after accidents or disasters in which they lost one or several body parts. Many women around the world have been victims of acid attacks that completely destroyed their faces and bodies; plastic surgery and beauty algorithms may be the only options they have to restore their appearance and be happy again. I agree that we are "beautiful just the way we are," that we "don't have to change a thing," and that "the world could change its heart," as Alessia Cara sings in "Scars to Your Beautiful." But if you think you need to change something to be happy, you should be given the chance to do so, because the real reason for living is happiness.
To conclude, technology has, overall, helped us live better lives. However, our new way of living is not always what we planned it to be. We are being dragged along by "monsters" hiding behind technology to profit from us. We should remember that everything new is not always good, and be careful about what we believe in.
- Turkle, Sherry. "Connected, but Alone?" TED.com, February 2012.
- Wikipedia contributors. "Beauty.AI." Wikipedia, The Free Encyclopedia, 11 Feb. 2021. Web. 15 Dec. 2021.
- Yaseen, Rene. "Changing Your Face for Your Following: The Implication of the TikTok 'Beauty Algorithm.'" Observer.com, March 15, 2021.
- Tutton, Marc. "Model's Death Highlights Plastic Surgery Risks." CNN.com, December 2, 2009.
- Joshi, Naveen. "What AI Is Doing in the Beauty and Cosmetic Industry." Allerin.com, June 8, 2020.
- Strong, Jennifer. "The AI of the Beholder." In Machines We Trust, podcast, YouTube, April.
- Rhue, Lauren. "The Emotion-Reading Tech Fails the Racial Bias Test." TheConversation.com, January 13, 2019.