Taylor Swift’s AI porn deepfakes have prompted a wave of proposals to protect women from abuse–but Congress and the states need to take action

Artificial intelligence (AI) may already be so out of control that it poses a growing threat to women’s safety. Non-consensual deepfake porn, like the images of Taylor Swift that made headlines, is only part of the story. Sexual exploitation and the normalization of violent imagery are on the rise, with AI opening up new ways to exploit and harass women.

My work as the second Black President of the National Organization for Women (NOW), the country’s largest intersectional women’s rights organization, as well as my personal background as a licensed social worker and mental health professional, makes it clear to me what must be done to stop the mental anguish and long-term damage happening online.

Congress and state legislatures are responding–but the questions remain: Are they acting fast enough? And how enforceable will their actions be?

The dark side of AI

There’s a new kind of thievery preying on women today–the theft of our bodily autonomy.  When it happens to people in the spotlight, it becomes a media sensation, but 99.9% of the victims of deepfakes and online sexual abuse can’t fight back the way a superstar can or wait for the damage to die down. Fake imagery causes real harm to reputations and self-esteem and shatters privacy. 

Women are the first to be exploited, attacked, and abused online in the most invasive ways possible–and with AI, the threat changes every day.  That’s why it’s so vital that we push lawmakers both in Washington, D.C. and the states to protect women from the dark side of AI.

The women-led dating and social networking app Bumble surveyed its community and found that one in three women has been sent unsolicited lewd images, and a 2021 Pew Research Center study similarly found that 33% of women under 35 reported being sexually harassed online.

What the law says

There are no federal laws that make it illegal to create or distribute deepfake pornography, although legislation to address this danger has been introduced. The Taylor Swift AI-generated images prompted a new wave of proposals and calls for action–but none has materialized so far.

Congressman Joseph Morelle (D-NY-25), along with 40 co-sponsors, has introduced the Preventing Deepfakes of Intimate Images Act, which would make the nonconsensual sharing of altered or deepfake intimate images online illegal, prosecute wrongdoers, and make them pay.

In the U.S. Senate, a bipartisan bill called the DEFIANCE Act would be the first federal law to prevent nonconsensual deepfake pornography. Introduced by Senate Majority Whip Dick Durbin (D-IL), along with Sens. Lindsey Graham (R-SC), Amy Klobuchar (D-MN), and Josh Hawley (R-MO), it builds on a provision of the Violence Against Women Act’s most recent reauthorization to prevent and prosecute cybercrimes.

We must build public support for laws like these–and for state legislation as well.

According to Axios, practically every state legislature in session today has taken up AI-related legislation, and nearly half of those bills deal with deepfakes. As of early February, there were 407 bills relating to AI, up from 67 one year earlier.

At least 10 states have passed deepfake-related laws, including Georgia, Hawaii, Texas and Virginia.  California and Illinois have given victims the right to sue. 

The conversations we’re having today among elected officials, government agencies, tech companies, and consumers will determine the impact AI has on people and society for generations to come. It took over a decade for government to understand the changes brought by social media platforms–but we can’t wait another 10 or even five years to take the next steps.

We need to update the rules and strengthen the online guardrails that were created when AI was a science-fiction movie prediction. Now that it’s here, AI is fueling a culture of toxic masculinity, misogyny, and abuse. We may not have long to seize this moment–and protect women from online abuse. 

Christian F. Nunes, MBA, MS, LCSW, is the president of the National Organization for Women.

The opinions expressed in Fortune.com commentary pieces are solely the views of their authors and do not necessarily reflect the opinions and beliefs of Fortune.
