More People Who Know Less Can Now Make More

What Past Disruptions in Creative Industries Teach Us About the Future Impact of AI

Let’s begin with an obvious statement: AI is radically transforming the technology industry at a rapid pace. Tools like Replit, GPT-4, and Copilot are letting people — non-technical people — go from idea to working prototype to published app in hours without writing a single line of code.

This paradigm shift, though significant, isn't unprecedented. While AI's scope of impact may be far broader, the underlying trend has played out repeatedly in recent history. Creative industries have already experienced similar transformations: tools became cheaper and more accessible, gatekeepers lost control, and creative output surged among those previously blocked by technical skill gaps or access barriers.

But if we stop at “more people who know less can now make more stuff,” we’re missing the real story.

The deeper, more useful question is: what actually happened in those industries after the gates opened? Who adapted and thrived? Who didn’t? And what does that tell us about where software development is heading?

Let’s take a closer look.

Music’s Digital Revolution Shifted the Bottleneck

When people talk about how music got democratized in the 2000s, they usually point to the same surface-level story: home recording software got cheaper, distribution moved online, and suddenly anyone with a laptop and a USB mic could make a hit. That’s all true, but it misses what actually changed beneath the surface.

The big shift wasn’t just that it got easier to make music. It was that removing one bottleneck revealed another downstream. In the old model, recording was expensive, distribution was tightly controlled, and success meant convincing gatekeepers (label executives, studio owners, radio programmers, etc.) to let you in. When those barriers fell, creation exploded, but attention didn’t. What changed was the locus of control. Labels lost their grip on production, but streaming platforms like Spotify simply moved the bottleneck to discovery. Today, algorithms decide what gets surfaced and what disappears. A song might be beautifully produced, but if it doesn’t hit the right engagement profile, it won’t make the playlist, and most people won’t hear it.

At the same time, the economics of music shifted dramatically. In the physical media era, selling an album brought in real revenue per unit. Now, an artist earns a fraction of a cent per stream. In other words, the song became the product’s wrapper, not the product itself. For most musicians, recordings are no longer the thing that makes money. They’re a marketing channel for live shows, merch, sponsorships, and fan engagement. Music became abundant and, in a purely economic sense, cheap.

If we translate that into software terms, the implications are stark. As AI makes it trivial to produce functional apps, the code itself becomes less economically valuable. The moat doesn’t come from having built something; it comes from having built something that people know about, trust, and keep coming back to. Just as music became an attention game, so will software. Distribution and discoverability will define outcomes. If your app isn’t surfaced by the right store, marketplace, or algorithm, it might as well not exist. And just like streaming reduced the value of an individual track, AI will reduce the value of a standalone feature. Your economic engine needs to come from everything around the code: the brand, the ecosystem, the ongoing relationship with users.

That’s the deeper lesson from music. When creation gets cheap, it’s not the creators who win by default. It’s the ones who adapt to a new supply-and-demand dynamic, where what’s scarce isn’t code, but time, trust, and attention.

Publishing Redefined Quality

The disruption of publishing in the 2000s and 2010s is often romanticized as a story of empowerment. Blogs took off. Social media gave everyone a voice. Traditional gatekeepers lost their stranglehold on who gets to speak and who gets heard. While that is all true, the shift wasn’t just about inclusion; it was also about information overload, fragmentation, and a profound change in how people evaluate credibility.

When publishing was expensive and slow, it was also filtered. An editor had to sign off, and there was usually a layer of fact-checking and revision. This didn’t guarantee truth, but it imposed quality-centric friction. There were costs to publishing something, and so publication implied (at least in theory) some threshold of quality. When platforms like Blogger, Tumblr, Medium, and Twitter emerged, those filters disappeared. Anyone could publish anything, instantly. The upside was clear: more perspectives, faster cycles, and a global readership. The downside was that the authority of “published work” collapsed.

Readers had to become their own editors. They had to learn to assess credibility, parse intent, and separate signal from noise, often without any training or tools. At the same time, the economics of media shifted away from accuracy and toward engagement. Clicks, shares, and outrage became the currency of the realm. And even as creation decentralized, discovery re-centralized, this time around a handful of tech platforms. Google, Facebook, Twitter, and YouTube became the new curators, but their incentives were tuned for virality, not nuance.

For software, this story feels eerily familiar. AI may enable more people to build apps than ever before, but that doesn’t mean the best ideas will win. If anything, the flood of new software will make it harder to tell what’s good. The key differentiators won’t be technical superiority. They’ll be trust signals: reviews, community engagement, (co-)branding, customer service, perceived professionalism. And, crucially, discoverability, whether through app stores, integrations, or even AI agents that begin recommending software on behalf of users.

We’re also likely to see the same breakdown of roles. Just as bloggers had to become their own editors, marketers, and publishers, solo software creators will have to own not just the product, but the presentation. In-house teams will need to think more like media outfits, telling coherent stories about what they’ve built and why it matters. And they’ll need to decide where they stand on a shifting spectrum of quality versus speed, credibility versus flash, because there won’t be any default filters doing that work for them.

Photography Changed What Photos Are For

The shift in photography wasn’t just about access. Yes, smartphones put cameras into everyone’s hands. Yes, the cost of taking a photo dropped to zero. But the more profound change was in the function of photography itself.

Photos used to be artifacts. You took them to preserve memories. You printed them, framed them, maybe mailed them to relatives. Because film was scarce and developing it cost money, people took pictures sparingly. There was a kind of intentionality built into the medium.

Digital photography, and later smartphones, rewired that relationship. Photography stopped being about preservation and became a form of communication. We no longer take pictures simply to remember. We take them to express. An Instagram story isn’t documentation, it’s dialogue. A meme isn’t a picture, it’s a slice of the zeitgeist dripping with implied meaning.

That shift brought cultural changes too. Aesthetic literacy rose across the board. People learned composition, filtering, and basic visual storytelling through pure exposure and repetition. The average person may not know the term “leading lines,” but they know when a photo looks good, or at least when it conforms with the popular definition of good. Meanwhile, professional photographers didn’t disappear. But their role shifted away from being the only ones who could take technically good photos, and toward those who could create something emotionally resonant or deeply intentional in a sea of decent snapshots.

The lesson for software is twofold. First, just like photos, software might become ambient. Instead of using a handful of big applications every day, users might increasingly interact with dozens of tiny tools, many of them ephemeral, single-purpose, even disposable. If AI makes building apps as easy as snapping a photo, it stands to reason that many apps will be purposefully created, used once, and then discarded. That isn’t a flaw. It’s a new mode of software interaction.

Second, as the baseline for software quality rises, differentiation will come from feel. Not raw capability, but tone, polish, emotional resonance. A dozen apps might offer the same functionality. The one that wins will be the one that feels like it was built with care. That’s what happens when tools become universal. Craft becomes the differentiator.

And just like in photography, curation becomes key. If everyone can build, then someone has to filter. That’s an opportunity for new product roles: not just engineers or PMs, but curators, editors, and community builders who help surface what matters.

What It All Means for Software: More Than Just More

These stories don’t map to software one-to-one, but the structural echoes are too loud to ignore. In every case, democratization brought more creation, and with it, more noise, more fragmentation, more pressure on the systems of distribution and filtering. But it also unlocked new forms of expression, new voices, and new business models.

If you’re leading a product team in this environment, the challenge isn’t just to build faster. It’s to build smarter. To understand where value shifts when the cost of creation drops to zero. To see how roles evolve, how platforms consolidate power, and how users recalibrate their expectations.

Because when software becomes abundant, code stops being the constraint. Imagination, empathy, and execution take its place.

What Product Teams Should Actually Do About It

If we take the lessons from music, publishing, and photography seriously, the path forward for software teams becomes clearer, but also more complicated. This isn’t just a question of “should we use AI tools?” That’s table stakes. The real questions are about how you structure your teams, how you define quality, how you prioritize, and how you lead in a world where execution is easier, but success is harder.

Let’s start with pace. One of the most obvious impacts of AI-assisted development is speed. Teams can now prototype entire applications in a day. This changes the nature of iteration. It doesn’t just mean shipping faster. It means reframing what an “MVP” even is. Product teams will need to shift from planning-oriented models to exploratory ones. Think of it like switching from blueprinting a building to sketching on paper. You don’t have to be completely right at the start, just a little bit right. More important is your ability to be responsive and to move quickly when new feedback or ideas surface.

But speed without guardrails is chaos. So while you want to open the throttle on prototyping, you also need tighter loops on quality control. That means updating your development process to make room for AI without sacrificing integrity. You’ll need clear practices for reviewing AI-generated code. You’ll need automated test suites that catch the kinds of regressions and vulnerabilities AI might miss. And you’ll need to invest in documentation and code hygiene, because even if an AI wrote it, your team will have to live with it and your company will have to scale on it.
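To make the "tests as guardrails" idea concrete, here is a minimal, purely illustrative sketch. The `slugify` helper below is hypothetical (it does not come from this article); it stands in for any utility an AI assistant might generate. The point is that a human writes tests pinning down the edge cases AI-generated code tends to miss, so the helper can be regenerated or refactored without silent regressions:

```python
# Guardrail tests for a hypothetical AI-generated helper.
# `slugify` stands in for any AI-written utility; the human-authored
# tests pin down edge cases (empty input, unicode, repeated separators)
# so regenerated code can't silently regress.
import re
import unicodedata

def slugify(text: str) -> str:
    # Imagine this body came from an AI assistant.
    text = unicodedata.normalize("NFKD", text).encode("ascii", "ignore").decode()
    text = re.sub(r"[^a-zA-Z0-9]+", "-", text).strip("-")
    return text.lower()

def test_slugify_edge_cases():
    assert slugify("Hello, World!") == "hello-world"
    assert slugify("") == ""                          # empty input must not crash
    assert slugify("Crème brûlée") == "creme-brulee"  # accented characters handled
    assert slugify("--a--b--") == "a-b"               # no leading/trailing dashes

test_slugify_edge_cases()
```

The implementation is disposable; the test is the durable asset. That inversion, where human effort concentrates on specifying behavior rather than writing the body, is what a quality loop around AI-generated code looks like in practice.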

Then there’s the question of who builds what. As AI lowers the bar for implementation, non-engineers are going to start shipping more. Product managers, designers, even domain experts from other departments will increasingly be able to create working tools. Some companies will resist this. They’ll worry about quality, security, or governance. And those concerns are valid. But the risk of letting it happen unsupervised is far lower than the risk of trying to stop it altogether. The history of software is full of shadow IT and unsanctioned tools, not because people wanted to go rogue, but because official channels were too slow or cumbersome. AI is going to accelerate that dynamic. The best thing you can do is acknowledge it, support it, and wrap it in structure.

That means providing safe environments for experimentation: sandboxes, shared AI tooling, best practice guides. It also means thinking about how to fold these non-engineer developers into your larger strategy. Maybe your PMs now own rough prototyping. Maybe your designers start implementing frontend components directly. Maybe your engineers become more like editors and architects, responsible for refinement, scalability, and integration. The point isn’t to defend turf. It’s to build the shortest path between an idea and something real, and the systems to support and scale that process.

Hiring will shift too. The emphasis will move away from rote technical ability and toward system-level and critical thinking. You’ll still need engineers who understand performance, security, and architecture. But they’ll also need to be comfortable editing, auditing, and improving code they didn’t write. Prompt design, AI debugging, and orchestration will become valuable skills, especially as models get more capable, but also more unpredictable. The best developers won’t be the ones who write the most code. They’ll be the ones who ask the best questions and design the best systems around the code.

Meanwhile, the kinds of people you want to bring onto product teams may start to look different. Writers, educators, curators, and other people who are great at making information digestible and systems understandable could become increasingly valuable, especially as the flood of AI-generated tools creates more noise than clarity. In some teams, the most important person might not be the one who builds the tool, but the one who figures out how to make it usable, discoverable, and trusted.

And trust is worth pausing on. As software becomes easier to build, it also becomes harder to trust. There will be more apps, more plugins, more scripts. Many will be buggy, insecure, or poorly maintained. Users will get burned. And when that happens, they’ll default to the same heuristics they use in other crowded markets: brand, design, community, and reliability. This means your code can’t speak for itself anymore. You need to think about everything that surrounds it: onboarding, documentation, support, responsiveness, community engagement. If you’ve ever wondered why some open-source tools take off and others don’t, even when the code quality is similar, this is why.

So product leaders need to think beyond just “what are we building?” They need to ask: Why does this matter? Who will care? What happens if it breaks? What does success look like after launch, not just at release?

They also need to ask how their team is learning. Because this isn’t a one-time disruption. The tools will keep changing. The capabilities will keep growing. The organizations that thrive will be the ones that treat AI like a dynamic force, not a static solution. That means creating a culture where experimentation is normal, where people share what they’ve learned, and where adaptation is part of the job. You don’t need an AI task force. You need teams that know how to notice when the rules change and respond accordingly.

None of this means throwing out what already works. Good design still matters. Sound architecture still matters. Secure code, reliable systems, and thoughtful UX arguably matter more than ever. What changes is the cost of getting to “MVP,” and the expectation of what it means to get to “done.” In that sense, the AI shift isn’t about replacing developers or redesigning product from scratch. It’s about changing the cadence and expanding who participates.

The job of the product leader isn’t to protect the old model. It’s to help their team succeed in the new one. That means getting comfortable with ambiguity, investing in new kinds of talent, and never forgetting that the thing AI can’t do, and will never do (come at me, Sam Altman), is care. About the problem. About the user. About the team. That’s still on you.

Where This Leaves Us

We’ve seen this trend before. Not with code, but with songs, books, photos, all the creative tools that used to belong only to specialists. When the tools got easier and cheaper, the number of people creating things exploded. But the value didn’t vanish, it just shifted. In every case, the shift was about more than technology. It was about power, trust, and how people decide what matters. And the same is happening now in software.

The arrival of AI as a creative partner and co-author means we’re entering an era where the cost of building a product approaches zero. That doesn’t make building irrelevant. It makes it foundational: a necessary but, on its own, insufficient step. The winners will be the ones who pair that speed with thoughtfulness. Who use the time saved by automation not to crank out more features, but to get closer to the user, to test better hypotheses, to sweat the details that AI can’t see.

It’s easy to see this as a technical transformation, but that would miss the point. The real shift is cultural. It’s about what teams value, how they work together, and how they define excellence when the old constraints no longer apply.

So the task for product leaders isn’t to figure out how to “keep up with AI.” It’s to figure out how to build teams and cultures that thrive in abundance. Where iteration is fast, but purpose is clear. Where tooling is fluid, but standards remain high. Where the focus isn’t just on what we can build, but on what’s actually worth building.

That’s not a future problem. That’s your job right now. And it’s one that still, maybe now more than ever, requires human judgment, human values, and human care.

Brian Root

Brian Root is a seasoned product management executive with a rich history at the helm of digital transformation in tech giants like Amazon and Walmart Labs. As the founder of Rooted in Product, he brings his expertise to early-stage startups and Fortune 100 companies alike, specializing in transforming product visions into reality through strategic leadership and system optimization.

https://www.rootedinproduct.com/brian-root-author-bio