In 2018 I interviewed Christopher Wylie - known today as the Cambridge Analytica whistleblower. Chris was one of the chief architects behind the 2016 Facebook election scandal.
He walked me through the variety of ways our social media content is curated by algorithms, friendly or nefarious, with the intention of keeping us on the app - by feeding us more of what we want, or more of what triggers us. In 2016 we experienced the power of curated content designed to amplify public division, and it worked dangerously well.
Most people are now aware that our behavior is manipulated in direct proportion to the amount of time we spend on social networks. By feeding us content designed to influence and deepen our opinions, social media has become the biggest catalyst for social conflict.
The platforms we use to communicate and share ideas are constantly analyzing our activity to determine what content will keep us on the platform longer. Whether we leave a comment or a like on someone's post, or simply hover over one picture longer than another while we scroll, all of it is measured, processed and fed into an algorithm that decides what we will see next, with the goal of retaining us for a few more minutes.
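To make that loop concrete, here is a minimal sketch of how engagement-based ranking might work. The signal names, weights, and scoring function are my own illustrative assumptions, not any platform's actual system.

```python
# Illustrative sketch only: the signals, weights, and scoring function are
# invented assumptions, not any platform's real ranking system.
from dataclasses import dataclass

@dataclass
class PostSignals:
    liked_similar: float      # how often the user liked similar posts (0-1)
    commented_similar: float  # how often the user commented on similar posts (0-1)
    hover_seconds: float      # average time the user hovered over similar content

def engagement_score(s: PostSignals) -> float:
    # Weight each observed behavior and combine into one score; higher scores
    # mean the post is predicted to hold the user's attention longer.
    return (0.4 * s.liked_similar
            + 0.3 * s.commented_similar
            + 0.3 * min(s.hover_seconds / 10, 1.0))

def rank_feed(candidates: list[tuple[str, PostSignals]]) -> list[str]:
    # Sort candidate posts by predicted engagement, highest first.
    ranked = sorted(candidates, key=lambda c: engagement_score(c[1]), reverse=True)
    return [post_id for post_id, _ in ranked]

# The content the user lingers on and reacts to most surfaces first.
print(rank_feed([
    ("vacation_photo", PostSignals(0.2, 0.1, 8.0)),
    ("political_rant", PostSignals(0.7, 0.6, 9.5)),
    ("recipe_video",   PostSignals(0.4, 0.2, 3.0)),
]))  # ['political_rant', 'vacation_photo', 'recipe_video']
```

Even in this toy version, the post that triggers the strongest reactions wins the top of the feed.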
Every additional second we spend on the platform is an opportunity to place an ad in front of us. Remember - we don’t have to click on the ad for the platform to get paid - if it appears in our feed, a transaction occurs - whether we scroll right past it or not.
I benefit from this as a content creator.
On my YouTube channel, we have 150K subscribers. But despite that audience, 76% of my views come from non-subscribers. Why? Our retention is much higher than average. Retention is the length of time people watch our videos compared with our competitors'. As a consequence, Google pushes my videos into the feeds of millions of non-subscribers, because videos with high retention create more opportunities to place ads.
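For anyone curious about the arithmetic behind that, here is a rough sketch of how retention could be computed and compared; every number is made up purely for illustration.

```python
# Rough sketch of the retention arithmetic; every number here is made up
# purely to illustrate the comparison, not real channel data.
def retention(avg_watch_time_s: float, video_length_s: float) -> float:
    """Fraction of a video the average viewer actually watches."""
    return avg_watch_time_s / video_length_s

my_video = retention(avg_watch_time_s=360, video_length_s=600)       # 60%
typical_video = retention(avg_watch_time_s=210, video_length_s=600)  # 35%

# A higher-retention video keeps viewers on the platform longer per click,
# which means more ad opportunities, so the recommender favors it.
print(f"mine: {my_video:.0%}, typical: {typical_video:.0%}")
```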
When I scroll through Twitter, Instagram or Facebook, I am aware I am seeing customized posts that are statistically more likely to keep me on the app.
Have you ever found yourself scrolling through a platform and suddenly realized that 5-10 minutes have passed by? The platforms are dangerously good at what they do.
This isn’t news to many of us - but here is where things might get interesting.
Social media is sophisticated enough to sway public opinions on major issues, and addictive enough to keep many of us coming back numerous times per day. But all of this has been accomplished via handheld screens, smaller than a pint glass.
Everyone is now talking about the Metaverse - full sensory immersion into the digital experience we currently view on our phones. Facebook recently changed its name to Meta in a bold statement of its future direction.
If our little pocket sized distractions have the ability to direct our behavior, what will be the result of full immersion into a universe created by big tech?
Here is a question that I am grappling with at the moment - Has intuition become a liability?
We love to believe our intuition acts in our best interest - and millennia ago, no doubt, it did. But although the idea of "trusting our gut" is romantic - what does it actually mean? Trusting our gut means trusting the stimuli that have influenced our subconscious.
There was a time when those stimuli may have been some disturbed brush, or a rotten smell - signals our brain read as threats or opportunities. But today, our "environment" is not the natural world we evolved in; it is an envelope of algorithmic content - content built to create an emotional response by surrounding us with triggering stimuli.
We are in a strange land, and our navigation system is failing.
I am a technology enthusiast and an advocate for the positive benefits of technological adoption. I run my company virtually, I order my groceries on an app, I've used every wearable fitness tracker that exists, and I love it. But like anyone, I want to retain what free will I have.
More importantly, I have three young kids. They will grow up in this new world - whatever it becomes.
I see a few potential outcomes.
One is that as they come of age, native to a digital world, the novelty will not be in virtual experiences - but the opposite. The novelty will be a newfound respect for real human interaction. In this scenario - my generation are the cigarette smokers of the 1970s, increasingly aware of the downside risks, but not convinced enough to put the cigarettes down. My kids will look back at this and shake their heads saying, “How could they not have known this was killing them?”
They may grow up with an inherent understanding of the dangers of integrating their lives virtually - or at least a better understanding of the difference between toxic and productive virtual habits.
But alternatively…
The seductive nature of the Metaverse proves too powerful for even the most independent of thinkers. With no counterbalancing options, the virtual world and its digital influencers are readily available to pull kids in a negative direction.
There are direct correlations between the adoption of social media and massive spikes in depression among high school kids. What will full immersion do?
Social media is built to provoke reactions from us. The most powerful tool in our tool belt is practicing saying no to an impulse - this builds resilience to influence.
Two weeks ago, in a letter, I wrote about avoiding triggers. When I feel triggered, the urge to react follows. But I have a choice. I can react, or I can pause, think and act. I don't always succeed at this, but when I do, I am happier with the result.
This is the skill we can practice and apply to impulse decisions.
We need to practice saying no. We must practice pausing, thinking and acting, instead of reacting.
The Metaverse is coming. That's not debatable. I am not planning to avoid it. But we are going further down a road that has already proven unmanageable. The car has already lost control, and now it's about to slide off the edge.
What do you think?