As technology has grown, the term "deepfake" has become a fixture of popular culture. Spreading like wildfire, fake news has not only proliferated across the internet but has also affected the consumers who absorb it. Before continuing, I would like to explain what exactly a deepfake is. Deepfakes come in a variety of forms but are generally defined as video or audio recordings created to look and sound like an original piece. Powered by artificial intelligence, the technology can be used to edit facial expressions, voices, or other recorded material. A simpler definition: any video or picture that has been altered from the original content. Although deepfakes were once the work of highly skilled computer scientists and technologists, that is no longer the case. As the technology has spread, it has become much easier for the general public to create deepfakes of their own.
When hearing the term "deepfake," many individuals are quick to jump to negative assumptions. This is understandable, as manipulating an original piece can cause controversy and unfounded rumors. For instance, as of 2023, one of the most viewed and shared deepfakes features former President Donald Trump and current President Joe Biden playing video games online. Influencers on TikTok were not only able to manipulate what the presidents appeared to be saying to each other, but also to illustrate just how toxic real-life conversations can become when playing online. Because the fabricated conversations included foul and inappropriate language, many consumers were confused about what was real and what was not.
Because deepfakes have become overshadowed by these negative assumptions, many individuals tend to forget the positives that can come with them. Some people, especially in the art community, use deepfakes to create pieces that engage consumers in ways that have never been done before. Focusing on making an image come to life, technologists have been able to reanimate certain famous works. One example is the Mona Lisa: creators have edited her face to produce any expression they want, and some have synced her mouth to whatever they want her to say. One popular trend has been the Mona Lisa delivering powerful quotes that support feminism or uplift women in general. Another benefit of deepfakes is the ability to "meet" an individual who is deceased. For example, the Dali Museum in Florida allows visitors, for a price, to interact with the late artist Salvador Dali.
As the years go by and technology continues to grow, the future of deepfakes could bring consequences on either side, positive or negative. While deepfakes are best known for the harm they can cause, focusing on how engaging and important a tool they could be would benefit any art community.
4 thoughts on “The Future of Deep Fakes”
This topic of deepfakes is really interesting. While they can be used to make harmless, silly videos that everyone knows are fake, they are blurring the line between artistic expression and infringing on people's right to consent. Something that is becoming increasingly common is deepfake pornography, where celebrities' faces are typically superimposed onto an adult film star's body. I find it so invasive and disgusting because those people did not consent to their image being used that way, and I have learned that it even happens to child actors like Millie Bobby Brown. I don't know the legal details, but I hope that if there aren't laws protecting people from deepfakes being used this way, there will be soon.
I can see why deepfakes became trendy and popular as fast as they did. I personally watch the videos of the presidents playing video games and find them very funny because, as stated in the article, they predict phrases the presidents might say to each other. I find it offensive but entertaining. Still, I think access to making deepfakes should be limited, because the technology can easily be abused or misused.
This is a very current and relevant topic with both positive and negative consequences. I appreciate you arguing both sides in your article, as AI and deepfakes are not entirely bad and have allowed for positive advances in many fields and industries. When you talk about deepfakes letting viewers better interact with art, I found it interesting how expression in traditional paintings can be better conveyed with technology's aid. One negative aspect of deepfakes I would like to touch on is their unrestricted availability on public platforms. Deepfakes have been used to create pornographic images or videos of people performing sexual acts, and these have then been used to blackmail and defame individuals. Deepfakes have also been used to scam families out of money: in many instances, a younger relative's voice is copied into a very convincing deepfake, which is then used to make a distress call to the family begging for money. Playing on heightened anxiety, these scams have succeeded in stealing from families without anyone being physically kidnapped or harmed. With the rise of new technology, everyday people need to stay up to date on new advances to keep themselves safe and informed.
When I first heard about deepfakes back in 2019, I thought it was all fun and games, creating videos or audio of our favorite celebrities and mocking them in a funny way. But as our technology evolves, I have realized the danger it truly poses, not only to our country as a whole but to individuals as well. In one case I heard about (I do not remember the victim's name), a person's face appeared in sexual videos without their consent. As this came to light, it was discovered that hundreds of celebrities were facing the same problem, and it makes me wonder how many more ordinary people this has happened to. Deepfakes are far more negative than positive.