Someone asked me the other day: how do you stay relevant in the next wave of technology? How do you make sure you’re not left behind?
I’ve been thinking about this a lot. The conventional answer is something like “keep learning, stay adaptable, upskill.” And sure, that’s not wrong. But it’s also not very interesting. It’s the kind of advice that sounds wise but doesn’t actually tell you where to look.
Here’s what I actually believe: the next wave will not come from where everyone is already looking.
The Infrastructure Illusion
Right now, the AI world is fixated on foundation models and GPUs. The big model labs and NVIDIA are the center of gravity. They’re the ones making headlines, raising billions, and attracting the best talent. And yes, they are doing extraordinary work.
But if you zoom out and look at the history of computing, you’ll notice a pattern. The infrastructure layer always stabilizes. The people building the rails are critically important, but they are not, historically, the ones who create the most surprising value. AT&T built the telephone network. They did not invent the internet. IBM built mainframes. They did not invent the personal computer. The GPU clusters of today are the telephone exchanges of the 1960s — essential, but not where the real story is going to unfold.
Computer science, at its core, is about building another layer of abstraction. Every generation of infrastructure eventually becomes a commodity, and the explosion of creativity happens one layer above. Jensen Huang said something I agree with: every CEO who is laying off people because of AI is fundamentally suffering from a lack of imagination. The infrastructure is a tool. What you build with it is the actual game.
The Pattern of the Margins
Now here’s the part that I find genuinely fascinating. When you look at where the most transformative innovations in computing actually came from, it’s almost never the mainstream. It’s almost always the edges.
The open source movement didn’t come from corporate R&D labs. It came from Berkeley, from the same cultural ecosystem that produced the Free Speech Movement, the counterculture, and yes, LSD-fueled parties where people talked about consciousness and liberation. Richard Stallman wasn’t a businessman optimizing for market share. He was a guy at MIT who got angry that he couldn’t fix a printer driver because the source code was locked. That anger became GNU. That became the ethical foundation for Linux, which today runs virtually every server on the internet.
Linux itself was a hobby project. Linus Torvalds was a 21-year-old student in Helsinki who posted on a mailing list: “I’m doing a (free) operating system (just a hobby, won’t be big and professional like gnu).” He was wrong about the “won’t be big” part. He was right about the “hobby” part — that’s exactly the energy that made it work.
Apple was born in a garage, in the orbit of the Homebrew Computer Club — a gathering of hobbyists, amateur radio enthusiasts, and counterculture technologists whose shared conviction was that computers should not be monopolized by large corporations. The entire personal computer revolution was ignited by people the mainstream industry considered irrelevant.
The World Wide Web was a side project. Tim Berners-Lee built it at CERN to solve a mundane problem — sharing documents between physicists. CERN had no interest in commercializing it. The most transformative communication technology since the printing press was, from an institutional perspective, a footnote.
Bitcoin emerged from cypherpunk mailing lists and crypto-anarchist philosophy. Satoshi Nakamoto wasn’t a fintech entrepreneur pitching VCs. The entire premise of cryptocurrency was born from a radical ideological position about the nature of money and state power.
The Transformer architecture — the thing that made the current AI wave possible — was introduced in “Attention Is All You Need” at a time when the mainstream NLP community was still invested in LSTMs and CNNs. All eight of the paper’s authors have since left Google. A single underestimated paper triggered a paradigm shift worth trillions.
Minecraft, the best-selling video game in human history, was built by one person, Markus Persson, in his spare time, inspired by Infiniminer, an obscure game almost nobody had heard of.
Napster, written by a college kid named Shawn Fanning in a dorm room, destroyed and rebuilt the entire music industry’s distribution model.
The MIT Tech Model Railroad Club — a group of people obsessed with the electrical circuits of model trains — invented the word “hack” and gave birth to hacker culture. The entire ethos of creative, playful, unauthorized exploration of systems came from train enthusiasts.
And Neuro-sama, the AI VTuber I’ve been deeply fascinated by — she was born from her creator Vedal’s simple thought: “I do not want to be an engineer.” Not a business plan. Not a market analysis. A personal rejection of the default path, followed by the stubborn act of building something he actually wanted to exist.
Why the Edges?
The pattern is so consistent that it demands an explanation. Why do the edges produce disproportionate innovation?
I think the answer is structural. When the mainstream pays attention to something, capital flows in. When capital flows in, competition intensifies. When competition intensifies, everyone converges on the same strategies, the same metrics, the same optimization targets. The space gets crowded. Alpha gets arbitraged away. You end up with a hundred companies doing roughly the same thing, differentiated only by execution speed and funding size.
The edges are different. Nobody is competing there because nobody thinks there’s anything worth competing for. There are no KPIs, no “competitive landscape analyses,” no “TAM calculations.” There’s just someone with an obsession and the freedom to follow it. And that freedom — the absence of external pressure to optimize for legible metrics — is precisely what allows genuinely new things to emerge.
This is, incidentally, very close to what Clayton Christensen described as disruptive innovation. Disruption never comes from market leaders. It comes from the low end, the overlooked, the dismissed. But I think the phenomenon is even broader than Christensen’s framework suggests. It’s not just about market positioning. It’s about cultural positioning. The most important innovations come from people who are not just in a different market segment, but in a different cultural universe entirely.
What This Means for You
If you’re trying to figure out where to position yourself for the next wave, stop looking at where everyone else is looking. The foundation model race is fascinating to watch, but if you’re not already at DeepMind or OpenAI, trying to compete on that axis is probably not your best move.
Instead, look for the edges. Look for the places where genuine passion exists but mainstream attention doesn’t. Look for the weird, the niche, the culturally specific, the things that make VCs shrug because they can’t model the TAM on a spreadsheet.
For me, that edge is the intersection of AI and entertainment — AI VTubers, voice synthesis, character AI, the merger of generative models with anime culture and digital performance. The big companies are pouring billions into chatbots and coding assistants. Almost nobody with real technical depth is asking: what does it mean to create an AI character that people genuinely love? What does it take to give a language model a soul?
That’s not because the question doesn’t matter. It’s because the people who control capital don’t have the cultural vocabulary to even articulate it. And that gap — between what matters and what gets funded — is exactly where the opportunity lives.
The tree that grows in the place nobody thought to plant one is the tree that becomes the forest.
If you found this resonant, I occasionally write about the intersection of AI, cyberculture, and entertainment. You can find me on GitHub or Twitter.