There's Something Happening Here...
"But what it is ain't exactly clear..." - Buffalo Springfield
Hello friends. It’s Monday, and weirdly a holiday that doesn’t really feel like a holiday.
Something Big Is Happening. But About What, Exactly?
Sooooo… if you spent any time online last week, you were told - confidently - that there was something “big” happening. Again.
A viral post by Matt Shumer declared that AI has crossed an invisible threshold, implying we should all probably start packing up our desks. I even jumped into the fray myself, mostly to suggest that maybe we shouldn’t surrender our agency to the machine just yet.
But that was just the appetizer.
Depending on which algorithmic rabbit hole you fell down, music is officially ruined, software companies are on the extinction list, and the AI-generated ads at the Super Bowl were soulless nightmare fuel that proved creativity is dead.
Meanwhile, in the “non-AI” timeline (which is getting harder to find), Barack Obama is clarifying jokes about aliens, and the polite citizens of Canada have been accused of cheating at Olympic curling. (If the Canadians are dropping F-bombs, we really are in the end times.)
Everyone agrees “something big is happening.” No one agrees on what that something is.
Some are arguing economics - efficiency curves and market inevitability. Some are arguing labor - jobs, dignity, and authorship. Some are arguing art - meaning, soul, and why the Super Bowl commercials looked like they were directed by a hallucinating toaster.
And we’re doing it all as if it’s one unified debate.
It really isn’t.
What feels like chaos isn’t just technological acceleration. It’s conversational misalignment. We are debating AI as if it’s a single challenge with a single right answer - when in reality, we’re arguing entirely different premises at the same time, premises to which there is no single right answer.
Which is why it feels less like a disagreement… and more like a crowded room where everyone is shouting in a different language.
So, yeah. What shall we do with all of that…
Zoom Lens: The Fulcrum Of Taste
Why We Are All Talking Past Each Other About AI
I’ve been fascinated - bordering on unhealthy obsession - with the disconnect in our professional debates. Whether you’re a content creator, a communicator, or an artist, it feels like we’re trapped in a Zoom call where everyone has a bad connection and is saying “can you hear me now?”
And I’ll be honest: I don’t have the Rosetta Stone for this. Half the time, I’m just as dizzy as the rest of the industry, trying to figure out if I should be learning Python or poetry.
But here is what I have noticed: if you tune out the volume and tune into the substance, you realize that none of these conversations are actually about the same thing.
We are arguing about AI as if it is a single, monolithic debate. It isn’t.
It is at least four different debates happening simultaneously. And the reason we feel so confounded - and believe me, I am the captain of the S.S. Confounded right now - is that we are yelling both within and across category lines without even realizing the boundaries exist.
Camp 1: The Inevitabilists (The Battle for Output)
In one corner, we have the Inevitabilists. Their position is brutal and linear: The machine produces faster and cheaper; therefore, it wins. This is the business/market efficiency argument - more songs, more code, more copy, more ads.
From this vantage point, volume is progress. If a model can generate 1,000 songs in an hour and a website to sell them in 13 minutes, it has “created” value. This camp treats creativity like coal mining. Extraction is the metric. Tonnage and speed matter.
But inside this camp, there is distinction - if not disagreement - brewing over Quality vs. Viability.
On one side, you have the Optimizers: “If the AI solution is 95% as good as the human but costs zero, the efficiency of markets will reign.” They are betting on (or resigning themselves to) the “Good Enough” economy.
On the other side, you have the Technocrats: “If you think that code/copy is good, you must be a shitty coder/writer.” They argue that AI output is still derivative trash - or they are counting on it remaining so - and that anyone relying on it is accruing creative debt they can’t pay back.
But notice the fatal assumption they both share: They believe Output equals Value. They are confusing the act of production with the impact of the product.
Camp 2: The Laborers (The Battle for Method)
In the other corner, we have the Laborers. Their position sounds morally sturdier: The effort validates the art.
But here, too, there is a massive fracture. They are fighting over Purism vs. Adaptation.
On the Purist side, you have the “47-Year Guitarist.” Their argument is moral: “I studied for decades. Spotify shouldn’t be allowed to replace me with a prompt. That’s cheating.” They believe the dignity lives in the human struggle.
On the Adaptive side, you have the “Prompt Engineers.” Their argument is pragmatic: “You better get good at AI, because operating the machine IS the new skill.” They believe the dignity lives in mastering the new tool.
You’d think, as with the Inevitabilists, that these two groups are enemies. But they are actually arguing the same premise: that value is defined by the Method. One side worships the callus on the finger; the other worships the syntax of the prompt.
They both believe that if you master the exertion (physical or digital), you create the value.
The Missing Debate: The Rise of the Curator
These two camps talk past each other and themselves because one is arguing mechanics (Efficiency) and the other is arguing tools (Labor). Neither is arguing metaphysics.
Because beneath the noise is a much older, quieter question: Where is value actually created?
For centuries, we assumed value was created at the Source - by the exertion of making. In the crafts of communication (writing, music, imagery, design), we often assigned greater value based on the exertion behind the work. We put a higher price tag on the tortured artist.
Think of the premium the culture places on films like Apocalypse Now because of its infamous, tortured shoot. Or Fusion Jazz, which is often valued more because it is so physically demanding to play. Or novels like Joyce’s Ulysses or Pynchon’s Gravity’s Rainbow that earn their status partly from the perception of how hard and complex they were to write.
But what happens in an age of infinite, instant generation? The Source is cheap. The “making” is trivial. The suffering is gone.
Value migrates downstream. It moves to the Filter.
This is where everyone - including the guitarist with 47 years of experience and the veteran copywriter - actually wins, but not for the reason they think.
Yes, the machine can now move the digital “fingers” on the fretboard. The motor skill is commoditized. But the experience, the wisdom, and the discernment are not.
Those 47 years of practice weren’t just training your hands to move; they were training your mind to hear. They were building a library of nuance, emotion, and shared human context.
And this is where Trust and Value enter the equation.
When AI can generate 1,000 songs in an hour, the world will inevitably be filled with millions of mediocre ones. The scarcity isn’t content; it’s reliability.
Trust and value now accrue to the Creator who is also a ruthless Curator.
It doesn’t matter if the song was born from fingers on a fretboard, a complex prompt, or a hybrid of both. The source is irrelevant to the consumer of the idea.
The effort is distinct from the value.
Should the artist place a higher premium on the creation because it was forged in terrible sadness, years of physical exertion, or excruciating pain? Sure. That is their prerogative. That is the Cost of their creation.
But the moment they ask the audience to pay for that effort, it ceases to be a production cost and becomes a Brand Attribute. It becomes Lore.
It is a narrative layer - marketing, effectively - that might sway the audience if they know the backstory.
What matters now is that the consumer (whoever that is) trusts you - the artist, the strategist, the leader - to stand in front of that wall of infinite noise and say: “These 999 may be noise. But this one... this one tells the truth.”
We trust you because you only present the truth. Reliably.
That is the ultimate argument for studying your craft first, and then studying how AI helps. You don’t study to only learn how to move your hands; you study to learn what “good” actually looks like. You study so that when the machine hands you a thousand options, you have the wisdom to choose the one that matters.
From Operator to Director
This new value is why the “go get good at AI” advice, without anything else accompanying it, rings so hollow right now - even to people who can’t quite articulate why.
If you’re new or middling at a particular craft, it feels like you’re simply too late. It’s depressing. It feels like you should just cede your potential to the machine. If you’re a veteran, it feels like you’re being told there’s a “cheat code” that renders your decades of experience obsolete.
In both cases, it treats the human as an Operator - someone who just turns the crank.
But we are moving from a world of Operators to a world of Directors.
The Operator asks: “How do I make more of this, faster?” The Director asks: “Of all the things we could make, which one should we make?”
If AI becomes the world’s most prolific crank-turner, it doesn’t eliminate the need for direction; it amplifies it. Because without taste - without the ability to distinguish between technical accuracy and meaningful insight - infinite output is just infinite noise.
I think the reason it feels like everyone is missing the point is that some are defending hands (Labor) and some are celebrating horsepower (Output), but very few are defending our Eye (Judgment).
In a world where the Source is infinite, the Filter becomes the seat of power. Not as a gatekeeper, but as a guide.
And for marketing leaders, writers, and thinkers, this is the pressure point. If you don’t know what “good” looks like - if you haven’t developed the deep, hard-earned taste to curate the signal from the noise - then you are just adding to the pile.
The machine can fill the page. It can fill the screen. It can fill the silence. But only you can make it matter.
WIDE ANGLE LENS
Let’s zoom out and look at the interesting things catching my eye in marketing and advertising this week.
🎬 Disney Bets on Experience (Not Just IP)
Harvard Business Review profiles Disney’s incoming CEO Josh D’Amaro as the poster child for “experience intelligence.” Translation: It’s not enough to just own the Wookiee; you have to know how to make people feel something when they meet the Wookiee.
Read it at HBR →
🤖 The Mothership Calls OpenClaw Home
OpenClaw’s creator just joined OpenAI. Because of course he did. Another tool folds into the bigger machine. Another independent layer gets assimilated. The headline reads like acceleration, but the interesting part isn’t the consolidation. It’s the quiet realization underneath: We are building increasingly capable engines, but we are running out of people who know where to drive them.
Read more at TechCrunch →
🎥 Apple Invents Television (Again)
Apple is launching video podcasts, taking aim at YouTube and ensuring that creators now have to worry about lighting and audio quality. More cameras. More feeds. More surfaces to fill. The expansion makes strategic sense, but it highlights the bottleneck of our moment. We don’t have a distribution problem; we have an attention problem. The challenge isn’t getting the video to the screen. It’s making something worth watching twice.
Read it at TechBuzz →
LENS FLARE
What I’m reading right now
📘 The End of Competitive Advantage — Rita McGrath
Good lord, it’s hard to admit that one of my favorite business books of all time is now 10 years old. <sigh> Anyway, I revisited The End of Competitive Advantage recently, and (as usual) it reads more like it should have come out this month.
McGrath’s core argument is simple: sustainable competitive advantage is mostly a myth. In fast-moving markets, advantages are transient. They appear. They erode. They migrate. The winners aren’t those who defend moats forever - they’re the ones who learn how to move.
Which feels uncomfortably relevant to our AI moment. AI doesn’t end competitive advantage. It compresses it. It shortens the half-life of being “better.”
🍷 Lens Cap - The Finishing Notes…
So, what did we learn this week?
In just seven days, we’ve been told that AI has rewritten music, eaten the software industry, turned the Super Bowl into a hallucination, and - just for variety - that aliens might be real and Canadians might be cheaters. (Honestly, I’m most worried about the Canadians. They’re usually so polite, but I also get why they might be a bit prickly right about now.)
Anyway, the volume of the debate makes it feel like we’ve crossed a line we can’t uncross. And maybe we have. But maybe the real shift isn’t that machines can finally produce more than we can.
Maybe the shift is realizing that production was never the point.
Value leaves the Hands (the labor). It leaves the Horsepower (the speed). It moves to the Eye. You know, the one that belongs to the beholder…
The one who decides what matters. The one who discards the noise. The one who stands in front of a tsunami of “good enough” and says, “This. Not that.”
Because in a world where everything can be created… Strategy is what we choose to stand behind.
It’s Your Story… Curate It Well…




