
International experts at war against AI?

Wed 21 Feb 2024 ▪ 4 min of reading ▪ by Luc Jose A.

AI technology is advancing by leaps and bounds and revolutionizing many fields. However, it also comes with serious downsides, and deepfakes are among the most worrying. Faced with the proliferation of this fake digital content, experts are sounding the alarm and calling for stricter legislation.

AI: Experts call for strict legislation to limit the dangers of deepfakes

An appeal for strict regulation to protect digital integrity

Current artificial intelligence legislation does not sufficiently regulate the production and dissemination of deepfake content. That is the complaint of a group of international experts in an open letter entitled “Disrupting the Deepfake Supply Chain”, published on February 21st. These researchers call on legislators to adopt strict measures to stop the spread of deepfake content, in particular sexually explicit images and videos or material aimed at political disinformation.

The group includes prominent figures such as Andrew Yang, Stuart Russell, Joy Buolamwini, Sarah Gardner, Gary Marcus, Steven Pinker, Sneha Revanur, and many other influential personalities. In the letter, the experts propose concrete recommendations to limit the many harmful consequences of this AI-generated content.

They urge legislators to criminalize pornographic deepfakes and to provide penalties for the creators of this harmful content. The experts also demand that AI software developers and distributors take action to prevent the creation of these deepfakes.

A response to the proliferation of these fake recordings generated via AI

Deepfakes literally exploded in 2023. The Home Security Heroes website estimates that more than 95,820 AI-generated deepfake videos were circulated over the past year, an increase of more than 550% compared with 2019. Pornography accounts for the overwhelming majority of this content: 98% of the deepfake videos generated in 2023.

In the United States, for example, several deepfakes have flooded social media in recent weeks. Fake pornographic images of the singer Taylor Swift first roiled the social network X. The images were viewed by more than 45 million users and racked up hundreds of thousands of likes. An AI-generated pornographic video targeting high school students in New Jersey also went viral on social networks.

Finally, an audio deepfake impersonating the voice of President Joe Biden was used to discourage Democratic voters in the state of New Hampshire from turning out to the polls. These events underline the urgency of the situation and the importance of legislation that strongly condemns the production and dissemination of this AI-generated fake content.



Luc Jose A.

A graduate of Sciences Po Toulouse and holder of a blockchain consultant certification issued by Alyra, I joined the Cointribune adventure in 2019. Convinced of blockchain's potential to transform many sectors of the economy, I have committed to raising awareness and informing the general public about this constantly evolving ecosystem. My goal is to help everyone better understand blockchain and seize the opportunities it offers. Every day, I strive to provide an objective analysis of the news, decipher market trends, relay the latest technological innovations, and put into perspective the economic and societal stakes of this ongoing revolution.

DISCLAIMER

The views, thoughts, and opinions expressed in this article belong solely to the author and should not be taken as investment advice. Do your own research before making any investment decisions.