Conway’s Implication
A Thought Experiment
Conway's Law tells us that the systems we create tend to mirror our organizational communication structures. One reason this holds true is that organizations are made up of unique individuals, each bringing distinct knowledge, skills, and communication styles. But what happens when AI begins to assume all roles in the product development process, acting as product manager, designer, and engineer?
If this comes to pass, we’re not just witnessing an evolution in product development; we’re staring at an extinction event for traditional workflows. With AI serving as both strategist and executor, organizational communication would become highly structured, frictionless, and nearly instantaneous. But what does this really mean?
First, if AI can replace human roles in product development, it suggests that AI won’t be merely automating tasks — it will be making complex, multi-faceted decisions. The speed of iteration would be unprecedented, limited only by processing power and data availability. In theory, this could accelerate innovation, but it would also compress the space for deliberate thought and serendipitous creativity. When the time from concept to launch shrinks to almost nothing, the potential for experimentation becomes near-infinite, but the significance of each individual experiment is likely to diminish. Why take the time to form a hypothesis when you can test a thousand variations in real time and let the data dictate direction?
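That “test a thousand variations and let the data dictate direction” loop has a familiar algorithmic shape: a multi-armed bandit. As a minimal sketch, with entirely hypothetical variants and click rates, an epsilon-greedy loop mostly exploits whichever variant is measuring best and occasionally explores the rest:

```python
import random

def run_bandit(click_rates, rounds=10_000, epsilon=0.1, seed=42):
    """Epsilon-greedy allocation: usually show the best-measured variant,
    occasionally show a random one, and let observed clicks decide.
    click_rates are the (hidden) true rates used to simulate users."""
    rng = random.Random(seed)
    n = len(click_rates)
    shows = [0] * n   # impressions per variant
    clicks = [0] * n  # observed clicks per variant
    for _ in range(rounds):
        if rng.random() < epsilon:
            arm = rng.randrange(n)  # explore a random variant
        else:
            # exploit: pick the highest observed click-through rate so far
            arm = max(range(n),
                      key=lambda i: clicks[i] / shows[i] if shows[i] else 0.0)
        shows[arm] += 1
        if rng.random() < click_rates[arm]:  # simulated user response
            clicks[arm] += 1
    return shows

# Three hypothetical variants; variant 1 has the highest true rate.
traffic = run_bandit([0.02, 0.10, 0.03])
best = max(range(3), key=lambda i: traffic[i])
```

Nothing in the loop encodes a hypothesis about why a variant works; traffic allocation is dictated entirely by observed clicks.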
This leads us to a critical implication: AI-driven product design would become utterly dependent on a constant influx of large-scale data. Products wouldn't just be built for human needs; they’d be built around user engagement loops, optimized for interaction rather than intention. We already see early signs of this in platforms like TikTok, where content is tailored by algorithms to maximize retention rather than enrich human experience. In an AI-first development cycle, products might no longer serve a higher purpose beyond engagement itself. Would AI shape products to fit human needs, or reshape human behavior to fit the needs of AI?
With AI cycling relentlessly through ideas based on probabilistic outcomes, the “human” touch in product development risks becoming a relic. What once emerged from the messy, inspired collaboration of cross-functional teams may be replaced by purely data-driven design. AI can optimize for clicks, retention, and conversion rates, but what happens to emotional resonance? Cultural nuance? The ineffable qualities that make a product feel right? These elements don’t fit neatly into a predictive model.
Now, let’s consider what this AI-driven product development landscape might actually look like. The immediate benefit is undeniable: a dramatic increase in productivity and speed. This is why Big Tech and venture capitalists are pouring sums into AI that rival the GDP of entire nations. Projects that once took months or years could be completed in mere days, and the allure of such efficiency is impossible to ignore.
But with such speed comes a hidden cost: homogeneity. If AI iterates by continuously testing and optimizing toward what works best, then it will inevitably converge on a set of “proven” formulas. The outliers — the oddball, the opinionated, the deeply personal — risk being smoothed away in the relentless pursuit of engagement. Products may start to feel eerily similar, optimized to an almost mechanical perfection but devoid of soul.
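The convergence claim can be made concrete with a toy simulation, using entirely hypothetical numbers: if independent teams each keep adopting whichever design template currently measures best, initial diversity collapses toward a single winner.

```python
import random

def converge(scores, teams=20, rounds=50, copy_prob=0.9, seed=7):
    """Hypothetical teams start spread across distinct design templates.
    Each round, a team adopts the best-scoring template with high
    probability (otherwise it picks one at random). Returns the final
    choices and the number of distinct templates in use over time."""
    rng = random.Random(seed)
    n = len(scores)
    choices = [i % n for i in range(teams)]      # start with full diversity
    history = [len(set(choices))]                # distinct templates per round
    best = max(range(n), key=lambda i: scores[i])
    for _ in range(rounds):
        choices = [best if rng.random() < copy_prob else rng.randrange(n)
                   for _ in choices]
        history.append(len(set(choices)))
    return choices, history

scores = [0.8, 1.0, 0.6, 0.7]  # template 1 measures "best" on engagement
choices, history = converge(scores)
```

The simulation starts with every template in use and ends with nearly every team on the same one: the outliers survive only through the small random-exploration term.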
Then, of course, there’s the ethical dimension. AI isn’t neutral; it reflects the biases in the data it’s trained on. If AI is making all the design decisions, whose perspectives are being amplified, and whose are being erased? The problem isn’t just that AI might reinforce existing biases; it’s that these biases could become so deeply embedded in digital experiences that we stop noticing them altogether.
Perhaps the most unsettling thought is this: in a world where AI governs product development, we risk becoming mere test subjects in an infinite loop of experimentation. Imagine labyrinthine products designed not to help us accomplish a task and move on, but to guide us endlessly through, nudging us from corridor to corridor with algorithms measuring every aspect of our digital lives. The products of the future might not be built to empower us but to extract maximum engagement, fine-tuning our behaviors to fit their needs rather than the other way around.
As we move into this potential future, we must ask: are we still the users, or have we become the product?