Going deep on deep fakes
January 30, 2024 | Tags: REASON
This was a big week for AI-generated deep fakes. Sultan Meghji, who's got a new AI startup of his own, walked us through four stories that illustrate how AI will lead to more confusion about what's real and what's not. First, a fake Biden robocall urged people not to vote in the New Hampshire primary. Second, a bot purporting to offer Dean Phillips's views on the issues was penalized by OpenAI because it didn't have Phillips's consent. Third, fake nudes of Taylor Swift led to a ban on Twitter searches for her image. And, finally, podcasters used AI to resurrect George Carlin and got sued by his family for violating copyright-ish law. The moral panic over AI fakery meant that all of these stories were too long on "end of the world" and too short on "we'll live through this."
Regulators of AI are not doing a much better job of maintaining perspective. Mark MacCarthy reports that New York City's AI hiring law, which imposes punitive disparate-impact disclosure requirements on automated hiring decision engines, seems to have persuaded NYC employers, conveniently, that none of them are using automated hiring decision engines, so they don't have to do any disclosures. Not to be outdone, the European Court of Justice has decided that pretty much any tool that aids in decisions is an automated decision-making technology subject to special (and mostly nonsensical) data protection rules.
Is AI regulation beginning to suffer from backlash? Could be. Sultan and I report on a very plausible Republican plan to attack the Biden AI executive order on the ground that its main enforcement mechanism, the Defense Production Act, simply doesn't authorize the measures the order calls for.
In other Big Tech regulation, Maury Shenk explains the EU's application of the Digital Markets Act to tech companies like Apple and Google. Apple isn't used to being treated like just another tech company, and its contemptuous response to the EU's rules for its app market could easily spur regulatory sanctions. Looking at Apple's proposed compliance with the California court ruling in the Epic case and the European Digital Markets Act, Mark says it's time to think about price regulating mobile app stores.
Even handing out big checks to technology companies turns out to be harder than it first sounds. Sultan and I talk about the slow pace of payments to chip makers, and the political imperative to get the deals done before November (and probably before March).
Senator Ron Wyden, D-Ore., is still flogging NSA and the danger of government access to personal data. This time, he's on about NSA's purchases of commercial data. So far, so predictable. But he's also misrepresenting the facts by claiming flatly that NSA buys domestic metadata, ignoring NSA's clear statement that the metadata it buys is "domestic" only in the sense that it covers communications with one end inside the country. Communications that flow into and out of the U.S. have long been considered appropriate foreign intelligence targets — witness the current debate over FISA section 702.
Maury and I review Jim Dempsey's effort to construct a liability regime for insecure software. His proposal looks reasonable, but Maury reminds me that he and I produced something similar twenty years ago, and it is still not even close to adoption anywhere in the U.S.
I can't help but rant about Amazon's arrogant, virtue-signaling, and customer-hating decision to drop a feature that makes it easy for Ring doorbell users to share their videos with the police. Whose data is it, anyway, Amazon? Sadly, I'm afraid we know the answer.
It looks as though there's only one place where hasty, ill-conceived tech regulation is being rolled back: China. Maury reports on China's decision to roll back video game regulations, to fire its video game regulator, and to start approving new games at a rapid clip — though only after a regulatory crackdown had knocked more than $60 billion off the value of its industry.
We close the news roundup with a few quick hits:
- Outside of AI, VCs are closing their wallets and letting startups run out of money
- Apple launched what looks like an expensive dud – the Vision Pro
- Quantum winter may be back as quantum computing turns out to be harder than hoped
- And, speaking of winter, consumers and regulators seem to be cooling fast on self-driving cars in the wake of serious missteps by Cruise
Finally, as a listener bonus, we hear from Rob Silvers, Under Secretary for Policy at the Department of Homeland Security and Chair of the Cyber Safety Review Board (CSRB). Under Rob's leadership, DHS has proposed legislation to give the CSRB a legislative foundation. The Senate homeland security committee recently held a hearing about that idea. Rob wasn't invited, so we asked him to come on the podcast to respond to issues that the hearing raised – conflicts of interest, subpoena power, choosing the incidents to investigate, and more.
You can subscribe to The Cyberlaw Podcast using iTunes, Google Play, Spotify, Pocket Casts, or our RSS feed. As always, The Cyberlaw Podcast is open to feedback. Be sure to engage with @stewartbaker on Twitter. Send your questions, comments, and suggestions for topics or interviewees to CyberlawPodcast@gmail.com. Remember: If your suggested guest appears on the show, we will send you a highly coveted Cyberlaw Podcast mug! The views expressed in this podcast are those of the speakers and do not reflect the opinions of their institutions, clients, friends, families, or pets.