Australia criminalizes distribution and creation of deepfake pornographic material

The Australian government will introduce legislation Wednesday that will make it a criminal offense to create and share deepfake pornographic images of people without their consent.

Attorney General Mark Dreyfus said sharing such images is a damaging and deeply distressing form of abuse.

A deepfake is an image or video in which a person’s face or body has been altered to make it appear they are doing or saying something that never happened.

Deepfake pornography overwhelmingly affects women and girls. Increasingly, it is being generated by artificial intelligence.

The Australian government said it will not tolerate such “insidious criminal behavior.”

Dreyfus said it’s a crime that can “inflict deep, long-lasting harm on victims.”

The legislation, introduced Wednesday in Federal Parliament in Canberra, creates a new criminal offense banning the creation or sharing of digitally altered sexually explicit images without consent.

Offenders could be sent to prison for up to seven years.

Katina Michael is an honorary professor in the School of Business, Faculty of Business and Law, at the University of Wollongong.

She told the Australian Broadcasting Corp. that technology, including artificial intelligence, can help detect deepfake material.

“In essence, what we can do is detect deepfake videos,” she said. “They are literally special effects videos where the images have been manipulated frame-by-frame and, so, we can run videos through analyzers and digital platform providers can do that, social media providers can do that.”

She said that while artificial intelligence facilitates the creation of deepfake pornography, it can also be used as a deterrent.

Celebrities are often the victims of digitally altered material, but the crime has affected many other people.

Earlier this year, fake images of the American singer Taylor Swift flooded the internet, with one sexually explicit image of the singer reportedly being viewed almost 50 million times.

The new legislation in Australia will only apply to deepfake sexual material depicting adults, with child abuse material continuing to be dealt with under dedicated and separate laws.

In April, Britain said it would bring in similar legislation to ban deepfake pornography.

In Australia, the new deepfake laws are part of a range of measures aimed at reducing violence against women and addressing the role that technology, including social media, plays in propagating degrading and misogynistic attitudes.
