Customer Futures Perspective: welcome to the Zoom call... now please prove it's you
AI is going to unravel digital trust for us all. Maybe digital wallets will help us navigate the future chaos.
Hi everyone, thanks for coming back to Customer Futures. Each week I unpack the fundamental shifts around digital customer relationships, personal data and customer engagement.
This is the PERSPECTIVE edition, a weekly view on the future of digital customer relationships. Do let me know what you think - just hit reply!
If you’re reading this and haven’t yet signed up, why not join hundreds of executives, entrepreneurs, designers, regulators and other digital leaders by clicking below. To the regular subscribers, thank you.
PERSPECTIVE
Welcome to the Zoom call... now please prove it's you
Within 12 months we won’t be able to tell real video from fake. Including our own conversations.
The latest text-to-video AI platforms are out of this world. You should check out the latest video snippets from Nvidia. They are simply extraordinary, and built from prompts like:
"a storm trooper vacuuming a beach"
"sunset time lapse at the beach with moving clouds and colors in the sky, 4k, high resolution"
"an astronaut feeding ducks on a sunny afternoon, reflection from the water"
This is all going to be incredibly disruptive to the creative industries. We’ll look back on March 2023 as the turning point, and on 2022 as the good old days.
Many worry about the coming job apocalypse, especially for creatives. But as they say, “it won’t be AI taking your job… it will be someone using AI.” There will be as many new roles created as displaced (think about today’s SEO engineer or EV battery designer).
Yes, these new AI tools will result in an explosion of new artistic and creative possibilities.
But there’s a real problem coming.
When it comes to real life, AI is going to unravel digital trust for us all.
Let me give you three simple examples:
1. Trump’s arrest
Earlier this month I was on the BBC website following Donald Trump’s arrest in New York. And it hit me like a ton of bricks: I couldn’t tell if the photos of Trump arriving at the NYC courthouse were real or not.
A week or so before, a number of deliberately misleading photos had been created on Midjourney and shared widely across the web: Trump in a scuffle with the NYPD while being arrested, and the Pope in a massive puffy jacket.
It was all marked as fake. Published as fake. Deliberately fake. They were showing off what’s possible rather than making a political point.
And most of us watched, smiled and moved on.
But when it came to the actual photos taken of Trump on the 4th April, they looked… the same.
I mean, they could have easily been created on Midjourney too. It was a big wake-up call.
Why would I trust those ‘real’ photos of Trump any more than those circulated before the arrest even happened?
In this case, our digital trust comes from the publisher, the channel. Organisations like the BBC, the Associated Press and Al Jazeera.
But is that enough any more? Can we only rely on an ‘AI-Generated Fake Image’ strap across the front?
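One direction the industry is exploring is cryptographic provenance: the publisher signs the image bytes at the moment of publication, and anyone can later check that what they’re looking at is the exact file the publisher released. Here’s a minimal sketch of the idea in Python. Note the assumptions: the key name is invented, and a shared-secret HMAC stands in for what real provenance schemes (such as C2PA’s Content Credentials) do with public-key signatures, so that verifiers never hold the signing secret.

```python
import hashlib
import hmac

# Hypothetical publisher key, for illustration only. Real provenance
# schemes use asymmetric key pairs so readers can verify without
# being able to forge signatures themselves.
PUBLISHER_KEY = b"newsroom-signing-key-demo"

def sign_image(image_bytes: bytes) -> str:
    """The publisher attaches this signature when the photo is released."""
    return hmac.new(PUBLISHER_KEY, image_bytes, hashlib.sha256).hexdigest()

def verify_image(image_bytes: bytes, signature: str) -> bool:
    """Anyone holding the key can check the bytes match what was published."""
    expected = sign_image(image_bytes)
    return hmac.compare_digest(expected, signature)

photo = b"...raw image bytes..."
sig = sign_image(photo)

print(verify_image(photo, sig))            # unaltered photo: True
print(verify_image(photo + b"edit", sig))  # tampered photo: False
```

The point of the sketch: trust moves from “this came down the right channel” to “this carries a verifiable signature”, which survives the image being copied, re-shared and re-hosted.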
Source: the Verge
2. Zoom call
I was on a Zoom call recently, having a short intro conversation with a new personal contact. But just after the call finished, it struck me: How did I know that it was a real person and not a digital fake?
They seemed pretty real. A photo-realistic digital person moving and speaking, using a voice I didn’t yet recognise, with a standard Zoom background behind them.
This isn’t about a conspiracy theory. I’m not suggesting it was all a digital trick. But it made me question how I would know in the future.
What clues can we rely on?
My contact and I had agreed the Zoom link in advance… and we both showed up in the right place at the right time. They said the right things. The issue was that I didn’t have any previous context or signals to spot if something was wrong.
Given how fast AI is moving, won’t every remote call soon be fakeable?
The early internet wasn’t designed for digital trust… and neither are our collaboration tools today.
Zoom will confirm that no one else is listening to your call. That only those invited can join. But how do I know it is really the right human? Once again, we are relying on - and trusting - the digital channel.
Does that make Zoom - and Teams and the other platforms - our new digital identity providers? In the same way that today our email platforms have become our de-facto identity providers?
There’s definitely enough training data already out there for someone to be able to spoof me on a video call.
Maybe it’s fine. The latest AI video tools aren’t that advanced, are they?
3. Police raid
Then this week I came across this bodycam video of an armed police officer entering a disused building and then chasing a suspect.
Worth watching now, before reading on.
Putting to one side how terrifying the video actually is, the point is that it is fake. Well, it’s kind of real. It’s part of the new video game ‘Unrecord’, distributed on Steam. The studio published this trailer to show just how realistic things are getting, and to generate excitement.
The video triggered a lot of people to shout ‘scam’. But the producers insist that not only is it genuine in-game footage, it is rendered in real time. They call it ‘un-real’.
Piecing it together
What happens when, inevitably soon, some AI team bundles these things together: fake photos with plausible content (Trump) + personal digital conversations (Zoom) + fake scenarios that look and feel real (Unrecord).
Here’s what I’m saying:
As of right now, it’s really only the channel, the publisher that’s bringing trust to our digital experiences.
There’s already enough training data out there to fake any one of us at will. We’re only one online recording away from a digital doppelganger.
This will soon all be available to produce in real-time.
Yes, text-to-video is exciting. And yes it’s going to create a lot of new value, new jobs and disruption.
But it’s also going to usher in a collapse of digital trust.
We won’t be able to believe what we’re seeing or hearing when online.
It’s a fun demo today. But share-price affecting tomorrow. It’s a college project today. But bank fraud tomorrow. It’s research today. But voter influence tomorrow.
We are all going to need new digital tools and digital infrastructure to confirm who - and what - we are dealing with.
But here’s the catch: it’s going to take time to build it. My sense is that the AI video teams are already one paragraph ahead of the reader.
Maybe we start using a digital wallet to authenticate ourselves at the start of each Zoom call? Or maybe I can begin sending my pre-verified, personalised robo-Jamie to attend calls so I don’t have to?
New digital wallets feel like a sensible way out of the inevitable digital chaos. To prove who we are, in any context or channel. Even anonymously when needed - many interactions don’t need to know who we are, but just that we are a real human.
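What might that wallet check look like at the start of a call? One plausible shape is a challenge-response: the host sends a fresh random challenge, the guest’s wallet signs it, and the host verifies the answer against a credential it already trusts. A minimal sketch follows; all the names are invented, and a shared secret stands in for the private key a real wallet would hold, purely to keep the example self-contained.

```python
import hashlib
import hmac
import secrets

# Illustrative only: a real digital wallet would hold a private key and
# present a verifiable credential, not a shared secret.
JAMIE_WALLET_SECRET = b"secret-held-by-jamies-wallet"

def host_issue_challenge() -> bytes:
    # Fresh randomness each call, so a recorded response can't be replayed.
    return secrets.token_bytes(32)

def wallet_respond(secret: bytes, challenge: bytes) -> str:
    # The wallet proves possession of the secret without revealing it.
    return hmac.new(secret, challenge, hashlib.sha256).hexdigest()

def host_verify(known_secret: bytes, challenge: bytes, response: str) -> bool:
    expected = wallet_respond(known_secret, challenge)
    return hmac.compare_digest(expected, response)

challenge = host_issue_challenge()
response = wallet_respond(JAMIE_WALLET_SECRET, challenge)

print(host_verify(JAMIE_WALLET_SECRET, challenge, response))  # real Jamie: True
print(host_verify(b"impostor-secret", challenge, response))   # impostor: False
```

Notice what this does and doesn’t prove: it confirms the other side holds the right credential, not that a human is behind the camera. That’s exactly why it pairs naturally with the “prove you’re a real human, even anonymously” idea above.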
One to think about when you are next getting arrested in New York, on a Zoom call with a stranger, or playing a real-time online video game.
You’re going to have to trust me on this one.
Thanks for reading this week’s edition! If you’ve enjoyed it, and want to learn more about the future of digital customer relationships, personal data and digital engagement, then why not subscribe: