"The Crown" recently completed its run — and its reign — on Netflix. But if new episodes are ever produced, one would need to be about this week's royal photo scandal.
How the episode would end is still a cliffhanger, however — because a Photoshopped image of Princess Catherine and her three grinning children, publicly released with the intent of calming a controversy over her hospitalization, has only heightened it.
The modern monarchy is rife with "conflict between public and private lives," said John Watkins, a University of Minnesota professor of English. It's "a public institution that is really inseparable from private lives of the individuals who hold those offices."
Those private lives now face increased calls for transparency — due in part, no doubt, to today's media environment, including interest in "The Crown." While that comes with a cost, "the alternative is the Victorian mannequin," said Watkins, an expert on British culture and society. Those former royals "were so handled — you only see the public body, not the private [person]."
Or, perhaps in the case of the Photoshop fail, neither, really.
Speculation about the matter, for which the princess has accepted blame, has run rampant online. It will likely recede, as most royal rumors do. Yet it points to a more profound problem of trust that may grow exponentially in the emerging artificial-intelligence era.
"There has been no shortage of debate regarding how generated AI imagery is changing the ballgame when it comes to things such as propaganda, politics, entertainment," said Andy Carvin, managing editor of the Atlantic Council's Digital Forensic Research Lab. "Every particular corner of culture is already being impacted by this fundamental question: 'How are we going to cope with the fact that we are increasingly unable to tell what's real and what's synthetic?'
"What I find fascinating about the current situation is that we're not talking about an image that was created out of whole cloth by some AI generator," Carvin said. "It was a photo that a person took and a person edited using common editing touch-up tools that are standard on almost any device that has a lens on it."
Those images that are created out of whole cloth may fray societies — and maybe even democracies.
"All of this is going to be even further complicated, because the reason I think a lot of people worry about the impact of generative imagery is its potential to go viral in volatile circumstances, situations that exacerbate existing partisan divides, fears, hates, whatever they might be," said Carvin. "And every social-media platform has its own way of compressing images to share stuff quickly on their networks, so even if a generative AI company were to install some form of watermark on an image, there's currently no guarantee that by the time it gets spread out through social media that watermark would even be detectable anymore."
And more meaningfully, would it matter?
"The thing that worries me the most is that even if we did have the ability to instantaneously determine whether an image were real or fake, or something in-between, does the public even care anymore?" Carvin rhetorically asked, and then answered, by saying: "Especially subsets of the public that put partisanship and winning above other values such as civil discourse and civic engagement? Because if you're a person who wants your side to win, or you're a politician who's trying to feed red meat to your base, it's not going to be the provenance that determines whether or not that image goes viral. It's how it taps into people's fears."
Carvin compared it to fact-checking, which grows ever more sophisticated but seems ever more Sisyphean.
"You can have the best fact-checking operation in the world, confirming or debunking things in real time. And yet it won't stop incorrect information or intentional disinformation from spreading because there's too much for an appetite for that type of material to take hold and be weaponized."
Which the FBI fears it will — and not just for future politics, but this year's election.
"The U.S. has confronted foreign malign influence threats in the past," FBI Director Christopher Wray said last week at a national-security conference. "But this election cycle, the U.S. will face more adversaries, moving at a faster pace, and enabled by new technology."
And it's not just an election year in America, but a year of elections worldwide, with many already facing foreign meddling attempts from Russia and other rogue state and nonstate actors.
Dis- and misinformation have already become lethal in some societies even without artificial intelligence, said Carvin, whose savvy observations from on-the-ground reporting during the Arab Spring earned him the moniker "the man who tweets revolutions." In India, for instance, Carvin said individuals were lynched because of false child-kidnapping rumors spread on WhatsApp.
"There's already nothing stopping us from creating more moral panics with the information that's already out there," said Carvin. "So what happens when you can dream up almost anything? And custom-design the image to go with whatever goal you might have in order to whip up a population into a frenzy?"
As for the row over the royal photo, the U.K. and even U.S. populations are already in a froth about the altered image, and what it reveals or is trying to conceal. And it's not just the population, but the popular press, which Watkins sees as having "a contradictory set of demands." The press, he said, "is simultaneously asking for transparency and at the same time being ruthless when there is a human foible."
That dynamic may indeed be driving the intense interest. But as loyal subjects (known by Nielsen as viewers) of "The Crown" know, pressure from the press has only accelerated, so it's surprising that the princess — or her royal PR guard — didn't anticipate the response.
But compared to previous clouds over Kensington Palace — and what could be coming from artificial intelligence — this storm will likely pass.
"In some ways," Carvin concluded, "the [photo] controversy felt quaint the moment it had spread." Because, he ominously added, "there are so many more inherently dangerous ways that mis- and disinformation can spread and cause violence and actually get people killed."
Indeed, while "The Crown" may have ended, the crown will endure. But the democracies that replaced monarchies and other inferior forms of government will be increasingly tested by AI.