Legal Scholars Dive into Implications of Deep Fakes

February 15, 2019

“Imagine the night before an IPO, a deep fake video comes out of the CEO soliciting a child prostitute or doing drugs,” University of Maryland Francis King Carey School of Law professor and privacy expert Danielle Citron, JD, said to a full house in the school’s Ceremonial Moot Courtroom.

“There goes the IPO, and the faith of the marketplace for the CEO is wrecked,” she continued.

Citron was the keynote speaker at the Maryland Law Review 2019 spring symposium, “Truth Decay: Deep Fakes and the Implications for Privacy, National Security and Democracy.”


"Privacy Implications of Deep Fakes" panelists at the Maryland Law Review Spring 2019 Symposium (bottom row l-r) Suzanne Dunn, University of Ottawa; Mary Anne Franks, University of Miami School of Law; Ari Waldman, New York Law School. (Top row l-r) Danielle Citron, Maryland Carey Law; Woodrow Hartzog, Northeastern University School of Law; Jessica Silbey, Northeastern University School of Law.

If you’ve never heard of a deep fake, you will, said Citron, who noted that “we are in a moment of pervasive disinformation.”


For the uninitiated, the term deep fake refers to the realistic digital manipulation of audio, images, or video to make it appear that a person said or did something they never said or did. The best deep fakes are nearly undetectable and therefore hard to debunk. Well-known examples feature celebrities such as Gal Gadot and Emma Watson inserted into deep fake pornographic videos.

The latest technique, the generative adversarial network (GAN), adds a frightening level of sophistication: it pits two machine learning models against each other until the fakes they produce are incredibly hard to detect.
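For readers curious about the mechanics, here is a minimal sketch of the adversarial training loop at the core of a GAN, written in Python with the PyTorch library. The model sizes, the random stand-in data, and the training schedule are all illustrative assumptions for this sketch, not details of any system discussed at the symposium.

```python
# Minimal GAN sketch (illustrative only): a generator learns to produce
# samples that fool a discriminator, while the discriminator learns to
# tell real samples from generated ones.
import torch
import torch.nn as nn

LATENT_DIM, DATA_DIM, BATCH = 16, 64, 32  # assumed toy dimensions

# Generator: maps random noise to a fake "sample"
# (in a real deep fake system, this would be an image or audio frame)
generator = nn.Sequential(
    nn.Linear(LATENT_DIM, 128), nn.ReLU(),
    nn.Linear(128, DATA_DIM), nn.Tanh(),
)

# Discriminator: scores how likely a sample is to be real (0 to 1)
discriminator = nn.Sequential(
    nn.Linear(DATA_DIM, 128), nn.LeakyReLU(0.2),
    nn.Linear(128, 1), nn.Sigmoid(),
)

loss_fn = nn.BCELoss()
g_opt = torch.optim.Adam(generator.parameters(), lr=2e-4)
d_opt = torch.optim.Adam(discriminator.parameters(), lr=2e-4)

for step in range(1000):
    real = torch.randn(BATCH, DATA_DIM)  # stand-in for a batch of real data
    fake = generator(torch.randn(BATCH, LATENT_DIM))

    # Discriminator step: push real scores toward 1, fake scores toward 0
    d_opt.zero_grad()
    d_loss = (loss_fn(discriminator(real), torch.ones(BATCH, 1))
              + loss_fn(discriminator(fake.detach()), torch.zeros(BATCH, 1)))
    d_loss.backward()
    d_opt.step()

    # Generator step: push the discriminator's scores on fakes toward 1,
    # i.e., learn to fool the detector
    g_opt.zero_grad()
    g_loss = loss_fn(discriminator(fake), torch.ones(BATCH, 1))
    g_loss.backward()
    g_opt.step()
```

The two models improve in lockstep: every gain the discriminator makes at spotting fakes becomes a training signal the generator uses to produce better ones, which is why the output of a well-trained GAN is so hard to detect.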

The “Deep Fake” symposium was inspired in part by “Deep Fakes: A Looming Challenge for Privacy, Democracy, and National Security,” a forthcoming article in California Law Review co-authored by Citron and Robert Chesney, JD, a professor at the University of Texas School of Law. The article provides a prescient first assessment of the causes and consequences of deep fake technology.

The event attracted some of the best minds in the legal and technology communities to participate in spirited panel discussions, including “The Privacy Implications of Deep Fakes,” which featured Woodrow Hartzog, JD, and Jessica Silbey, JD, both of Northeastern University School of Law; Mary Anne Franks, JD, DPhil, MPhil, University of Miami School of Law; Ari Waldman, JD, PhD, New York Law School; and Suzanne Dunn, JD, PhD, University of Ottawa.

Panelists for the “The Role of Intellectual Property, Platforms and Free Expression Concerns, and National Security Implications” discussion included Stacey Dogan, JD, Boston University School of Law; Olivier Sylvain, JD, PhD, Fordham University School of Law; Kate Klonick, JD, PhD, St. John’s University School of Law; and Thomas Kadri, JD, MA, Yale Law School. The panel was moderated by Carey Law professor David Gray, JD, PhD.

The third panel tackled the national security implications of deep fakes and featured Benjamin Wittes, a senior fellow at the Brookings Institution; Quinta Jurecic, managing editor, Lawfare; and Alan Rozenshtein, JD, University of Minnesota Law School.

The goal of the event, according to Citron, was to talk about the harm that deep fakes impose on individuals and society and then to “puzzle through together the modest way that law can intervene.”

And just how can the law intervene? “The law is a modest and blunt tool,” Citron admitted, “but we have to try.”

She ticked off areas of law that could be invoked in deep fake cases, including fraud statutes and criminal statutes against impersonating government officials. Citron and Chesney embrace the option of changing Section 230 of the Communications Decency Act to make immunity conditional for internet platforms. Section 230 currently “provides immunity from liability for providers and users of an ‘interactive computer service’ who publish information provided by third-party users,” according to the legislation.

“Right now, it’s a free pass. It provides no incentives for platforms to protect the vulnerable,” said Citron, who added there are platforms whose business model is based on abuse and destruction. “They make money off of eyeballs. They make money when stuff goes viral.”

“At the end of the day, Bobby and I don’t have clear answers,” said Citron, referring to Chesney, her deep fake paper co-author. “That’s why we wanted to bring the smartest people together in one room to talk about the privacy, the free speech, the IP, the national security implications of these images and video and audio.

“I think the lesson is the law moves like a pendulum,” she said, swinging her arm back and forth. “We overreact, we underreact, and hopefully we end up somewhere in the middle.”