Google’s video platform, YouTube, is rolling out several new policies aimed at curbing “deep-fakes.” A deep-fake is video content created with artificial intelligence (AI) to give the viewer the impression that a real person appears in the video when the figure is actually a computer creation. One of the more common uses is to take someone performing in a video and digitally alter their facial features to resemble a famous actor or actress.
This has become a growing issue in recent years, and the recent actors guild strike was partly about combating deep-fakes. Tom Hanks recently took to Instagram to call out an advertiser for using his likeness and to let fans know the figure was not him but in fact an AI-created facsimile of the actor.
YouTube has issued new guidelines on this technology to all of its creators. The company also aims to make it clearer to viewers when what they are seeing is an AI creation: YouTube will apply labels indicating the use of AI if creators do not clearly disclose it themselves.
YouTube’s VPs of Product Management, Emily Moxley and Jennifer Flannery O’Connor, posted online that “sensitive topics” will be especially closely regulated, naming elections, military conflicts, health issues, and public officials among them. They stated that creators who continue to operate outside the guidelines risk suspension or banning.
The company also announced new software called Dream Screen, which it says rapidly scans popular “YouTube Shorts” for AI content and identifies deep-fakes. YouTube Shorts account for roughly 70 billion views a day from about two billion viewers on the platform.
CEO Neal Mohan acknowledges there will be challenges in reviewing all the AI content, but he says that, in the end, everyone on the platform is subject to the community guidelines. YouTube also encourages users who encounter improperly labeled deep-fake content to report it to the company and request its removal.