The first time I noticed it, I was half-asleep and looking for a phone case.
Not even a special one. I wasn’t on some minimalist Japanese leather blog with curated photos of concrete and moss. I was on Google, typing with one thumb in bed:
“iphone case, modern, functional, premium”
I clicked a couple of things. Scrolled. Decided I was too tired to make a decision about drop protection and fake leather. Closed the tab. Opened YouTube, because obviously my brain needed a 27-minute video essay about a game I will never actually play.
And there it was.
Mous. Rubber in slow motion. An iPhone yeeted off a parking garage. That soft, smug “you’ve been thinking about me” energy.
It’s a weird feeling, that little jolt of “hey, how did you get here?”
Like you walked into a grocery store and the cashier said, “Hey Briggs, we put your favorite cereal at eye level for you, and also, we noticed you’ve been looking at phone cases, so aisle seven’s been rearranged in your honor.”
The marketer story we tell ourselves is simple:
There’s some guy in a hoodie named Kyle who “set up targeting.” Kyle “knows” I searched for phone cases, and now I’m in his “retargeting audience,” and he is personally following me around the internet.
That story is wrong, but it’s emotionally satisfying. It’s intimate. It makes the world feel small enough to point at.
The truth is less intimate and somehow more unsettling:
It isn’t really Kyle. It’s the system.
If you strip away the jargon, targeted ads are just this:
- I do things on the internet.
- Those things throw off little bits of data dust.
- That dust gets swept into piles by companies like Google.
- Advertisers pay to blow that dust back into my face, but arranged in the shape of a product.
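If you wanted to caricature that pipeline in code, it might look something like this toy sketch. Every name and number here is invented for illustration; real ad systems are vastly more complicated, but the shape — events swept into piles, piles matched against ads — is the same.

```python
from collections import Counter

def sweep_into_piles(events):
    """Aggregate raw browsing events into per-topic interest counts (the 'dust piles')."""
    piles = Counter()
    for event in events:
        for topic in event["topics"]:
            piles[topic] += 1
    return piles

def blow_dust_back(piles, ads):
    """Pick the ad whose topic matches the biggest pile."""
    return max(ads, key=lambda ad: piles.get(ad["topic"], 0))

# Hypothetical browsing session: two phone-case signals, one gaming signal.
events = [
    {"action": "search", "topics": ["phone case", "iphone"]},
    {"action": "click",  "topics": ["phone case"]},
    {"action": "watch",  "topics": ["video games"]},
]
ads = [
    {"name": "Mous case ad",   "topic": "phone case"},
    {"name": "mobile game ad", "topic": "video games"},
]

piles = sweep_into_piles(events)
print(blow_dust_back(piles, ads)["name"])  # "Mous case ad"
```

Nothing in that sketch knows who you are. It only knows which pile got the most dust.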
When I search “phone case,” I’m not sending a letter to Mous. I’m writing a line on a whiteboard that Google controls. Something like:
“This person is probably in the market for: phone cases, protection, iPhone, clumsy human.”
Google doesn’t love me. It doesn’t even really know me. It’s just constantly updating a probability estimate that looks suspiciously like a personality.
From my side, it feels like someone is watching:
I whisper “phone case” into my laptop, and the universe answers with Mous on YouTube.
From the system’s side, it’s more like:
“We’ve seen 92,347,158 people do something that looks like this. When they do, they usually click phone case ads. This new one is probably the same. Roll tape.”
Cold math. Warm video of a phone bouncing off asphalt.
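The cold math part is genuinely simple. A toy version of "people who look like you usually click" might be nothing more than a cohort click rate compared against a threshold — all figures below are invented, and real systems use far richer models, but the logic is the same:

```python
def predicted_click_probability(cohort_clicks, cohort_impressions):
    """Fraction of lookalike users who clicked this kind of ad."""
    if cohort_impressions == 0:
        return 0.0
    return cohort_clicks / cohort_impressions

# Hypothetical numbers: of the 92,347,158 lookalikes who saw a
# phone case ad, some fraction clicked.
p = predicted_click_probability(cohort_clicks=4_250_000,
                                cohort_impressions=92_347_158)

# "Roll tape" only if the estimate clears a (hypothetical) cutoff.
SHOW_AD_THRESHOLD = 0.02
print(f"p(click) = {p:.3f}, show ad: {p > SHOW_AD_THRESHOLD}")
```

Notice that nothing in there is about *me*. The estimate belongs to the pattern; I just happen to match it.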
Here’s what makes Google so absurdly ahead of anything a startup could build:
Most companies see one room of your house.
Google sees your whole neighborhood.
A random ecommerce store might know what you put in your cart.
Your bank might know how often you eat at Taco Bell.
Your fitness app might know your steps and your shame.
Google, though? They see:
- What you search when you can’t sleep.
- The YouTube rabbit holes you fall down when you should be working.
- The route you drive to work (hi, Maps).
- The weird thing you Googled that one time and immediately deleted because, “actually never mind.”
They don’t just know that “you looked for a phone case.” They know you’ve been:
- Comparing iPhone models
- Watching “what’s on my iPhone” videos
- Looking up “fix cracked screen cheap”
- Searching for “running shoes for bad knees”
- And then, randomly, “how to tell if your cat trusts you”
All of that becomes signal. Not because a person is reading it, but because the machine is.
A startup can’t compete with this. It’s not a matter of “better targeting features.” It’s a matter of time and mass. Google has twenty-plus years of the world’s collective late-night anxiety stored in its servers. You’re not out-AI-ing that from a coworking space with free cold brew.
So when I ask, “Is this a new strategy some startup could realistically create?” the honest answer is: not really. Not at this scale. Not in the same shape.
You can build tools around the edges. You can specialize. You can care more about privacy, or creative, or specific industries. But the raw “I searched for phone case, and now Mous is haunting my YouTube feed” magic?
That spell was cast a long time ago. We’re just noticing it now.
The weird part, for me, isn’t even that the ads are targeted.
It’s that we’ve collectively lowered the bar to:
“I guess this is just how the internet works.”
There’s this kind of shrug baked into digital life now.
- Of course everything is tracked.
- Of course the apps talk behind our backs.
- Of course searching for anxiety symptoms guarantees you’ll get ads for therapy, supplements, and a journaling app with pastel gradients.
We call it “personalization,” which is such a friendly word. It sounds like engraving. Like someone writing your name in pen on the inside cover of a book.
But it’s not really personal. It’s patterned.
The system doesn’t know that I am a human being trying my best to be a decent dad, occasionally overwhelmed, partially caffeinated, and slightly afraid of dropping expensive electronics on concrete. It knows I live in a cluster with other people whose behavior forms a line that trends toward “will buy a phone case like this.”
The ad doesn’t say, “Hey Briggs, we see your whole life, and we understand you.”
It says, “People who look like the pattern of you clicked this before. Wanna try?”
When you zoom out, this starts to sound like a philosophical problem more than a technical one.
We’ve built a civilization where:
- Our micro-decisions feed a huge invisible machine
- That machine optimizes for a single metric: conversion
- And the reward for feeding it well is… slightly more relevant ads
There’s something tragically small about that.
We poured staggering intelligence, money, and infrastructure into answering:
“What’s the fastest way to show Briggs a phone case he’ll probably buy?”
And don’t get me wrong, I like nice phone cases. I like when things fit my life. I like not scrolling through twelve pages of garbage for the one good option.
But there’s a strange emotional hangover to realizing how much of the internet is this giant funnel. How much of what feels like spontaneous discovery is really just the system saying, “We’ve seen this play out before. You’re up next.”
Then there’s the question I can’t quite shake:
Should any company get to keep that much behavioral history forever?
At some point in the conversation I found myself asking:
“Is there a world where it becomes basically mandatory to declassify data after ten years? Fully anonymized. Like public infrastructure?”

Old novels eventually slip into the public domain. Patents expire. Secrets declassify. The default assumption is that knowledge, given enough time, becomes a shared resource.
But my clickstream from ten years ago doesn’t get that treatment. It sits in some corporate archive as private capital. Something that can be mined, aggregated, fed into models. Not as my history. Just as more dust in the piles.
What would it look like if we treated data like we treat public parks or libraries?
I don’t mean raw, creepy databases where you can zoom in on your neighbor’s browsing history from 2013. I mean genuinely anonymized, aggregated, audited-by-actual-adults-with-ethics kinds of datasets. A “data commons” that says:
“You don’t get to hoard twenty years of human behavior and keep it as proprietary dark matter forever. Some of it belongs to the future, not just your quarterly earnings.”
Of course, the second you say that out loud, the complications sprint in:
- Anonymization is hard and often reversible.
- Governments love the idea of more data as much as corporations do.
- “Who controls the commons?” becomes its own power struggle.
- The companies holding the data will fight like hell to keep it locked.
Still, the question lingers. If my attention helped build this system, do I ever get anything back besides better ads?
Part of what makes this all so emotionally confusing is that Google Ads don’t feel like “infrastructure” from the outside.
When you and I experience them, they’re just… little rectangles. A pre-roll video. A sponsored result. Mildly annoying, occasionally helpful, sometimes freakishly well-timed.
From Google’s side, it’s a planetary-scale logistics network.
Underneath that Mous video on YouTube is a real-time auction happening in milliseconds:
- You open the app.
- YouTube sends out a “hey we’ve got a person here, these are the rough parameters” signal to advertisers.
- Advertisers (or, more accurately, their machines) shout back bids:
  - “I’ll pay 2.1 cents to show them a video about phone cases.”
  - “I’ll pay 3.4 cents to promote my mobile game.”
  - “I’ll pay 1.7 cents for my productivity app.”
- Google’s system weighs each bid against predicted outcomes and rules.
- Mous wins. Phone falls off a building in 4K. You wonder if this was destiny.
It’s like if every time you walked past a billboard, the message on it flickered through a hundred options in a fraction of a second before settling on the one most likely to make you stop.
You don’t see the fight. You just see the winner.
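That fight can be caricatured in a few lines. The sketch below ranks each bid by expected value — cents offered times predicted click probability — which is one common way to weigh bids against predicted outcomes. All bids and probabilities are invented, and real exchanges also factor in ad quality, budgets, frequency caps, and policy rules:

```python
def run_auction(bids):
    """Rank each bid by expected value: cents offered x p(click)."""
    scored = [(bid * p_click, name) for name, bid, p_click in bids]
    scored.sort(reverse=True)
    return scored[0][1]  # the winner is all the viewer ever sees

bids = [
    # (advertiser, bid in cents, hypothetical predicted click probability)
    ("Mous phone case",  2.1, 0.050),
    ("mobile game",      3.4, 0.020),
    ("productivity app", 1.7, 0.030),
]
print(run_auction(bids))  # "Mous phone case"
```

The mobile game bid more per impression, but the phone case ad wins because the machine thinks I'm more likely to click it. Milliseconds later, a phone falls off a building in 4K.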
No startup is catching up to that overnight. You don’t just whip up “a rival to Google Ads” between your Series A and your espresso machine installation.
And so we’re left living in this reality where a handful of systems decide, moment to moment, which messages get a seat at the table in our brains.
Here’s the part where I admit something unflattering:
Knowing all this doesn’t make me a more principled user. Not really.
I still click “accept all cookies” more than I should.
I still watch the pre-roll because my hands are busy with dishes.
I still get sucked into buying something I didn’t know I wanted.
Some days it bothers me. Other days it feels like background radiation.
But I do think there’s a middle ground between resignation and paranoia. Between “they’re surveilling my soul” and “haha the algorithm got me again.”
For me, that middle ground starts with adjusting the story in my head:
- No, there isn’t a specific marketer obsessed with me personally.
- Yes, there is a web of systems optimizing around my behavior.
- No, I’m not going to outsmart the entire ad ecosystem by sheer willpower.
- Yes, I can at least understand roughly how it works so it feels less mystical.
When you replace “they’re listening to my conversations through my phone” with “my search behavior and watch history are feeding a probabilistic ad engine,” the fear shifts shape. It becomes less ghost story, more city map.
You’re not cursed. You’re just extremely predictable at scale.
The wild part—and maybe the hopeful part—is that this same machinery can do more than sell phone cases.
The exact tools that power Mous’s YouTube haunting also:
- Help small businesses find their first real customers
- Let artists promote albums from their bedrooms instead of begging labels
- Allow nonprofits to reach people who actually care about their cause
- Make it possible for some kid with no connections to get discovered because the algorithm decided their video performs well with people like me
It’s not all sterile capitalism. There’s some accidental grace in there.
But accidental grace is still not the same as intention. The system doesn’t wake up in the morning and decide to make the world more just. It wakes up and maximizes click-through and return on ad spend. If good things happen on the way, that’s noise that happens to align with signal.
That’s why I keep circling back to this idea of “data aging out of private hands.”
Not as a perfect solution, but as a question worth holding in public.
If the pipes are now permanent—if we’ve decided the world we live in will always have real-time auctions for our attention running underneath—then at some point we have to ask who owns the water.
I don’t have a clean thesis to end this with. I’m suspicious of clean theses anyway.
What I have is a messy emotional pile: I search for a phone case, I get an ad, and I feel vaguely watched. I realize I’m not special—just another node in an endless pattern—and I feel both relieved and a little diminished.
That is the real story of modern tech. It isn't “evil” and it isn't “salvation.” It is a constant, low-level negotiation between convenience and dignity. It is vast systems that are impressive and kind of embarrassing, interacting with ordinary humans who are just trying to buy a phone case without tripping over the infrastructure of the entire digital economy.
Targeted ads aren't magic; they’re just pattern-matching on top of massive behavioral history. But somewhere between the cold math of the algorithm and the warm glow of the screen, there is a question we haven't really answered yet:
If our lives are the raw material for these systems, what do we get to own back?