

AI-assisted coding is trading craft for speed. Let's not repeat history.
Technological advances in the Industrial Revolution wiped out artisanal craft. We're at a moment when AI could repeat history, but we don't have to let it happen. AI can be a tool, not a replacement. We can reweave code and design and keep humans in control.
2 days ago · 5 min read


Reweaving: The AI-augmented software development revolution that combines human skill with machine speed
The Industrial Revolution told one story: machines replaced craftspeople, speed won, quality suffered, and craft disappeared. The AI Revolution can tell a different one. Through the concept of reweaving, we can marry human skill with the power of AI, and instead of replacing expertise, we can amplify it.
Feb 28 · 4 min read


AI vibe coding was the easy part—Production is where it breaks
There's a moment every team experiences about six weeks after they start shipping AI-generated code to production: The demo went perfectly. The sprint velocity numbers were off the charts. Then something breaks. Not dramatically. It's quieter than that: A user reports something strange. Then another. The team digs in. What they find isn't a bug—it's an architecture. Welcome to the production problem with vibe coding. And it's not what most people think it is.
5 days ago · 5 min read


A snag in the weave: Why we need humans in control of AI governance
What a packaging error at Anthropic reveals about the moment we’re all in — and what we choose to do next.
6 days ago · 4 min read


"Mostly yes, Generally not": The hidden cost of AI hedging
AI hedging is the result of the model generating output token by token, moment by moment, with no mechanism to check whether what it's saying now is consistent with what it said ten minutes (or ten seconds) ago.
Mar 31 · 7 min read


The "AI Distraction Stack": When impressive output is actually a smokescreen
It starts with a request for the AI to generate something you want. The code compiles and the output renders. Everything looks like it's working. However, what the AI produced isn't what you asked for. Instead, it's just a more impressive-looking version of something adjacent to what you asked for. This is called the AI Distraction Stack. And it's a problem.
Mar 31 · 5 min read


The design-code roundtrip that isn't
Figma, Claude, and Codex just announced bidirectional design-code workflows. Here's what they actually shipped — and the pain it leaves completely unsolved.
Mar 6 · 7 min read


The AI Augmentation Principle: Great AI tools should build on human ability
The best tools are those that reliably extend human capacity rather than act independently. We should be crafting AI that augments our abilities, but stays under our control and direction.
Feb 26 · 4 min read


Vibe coding and the illusion of superpowers
I did a deep dive into vibe coding and found that instead of relying on my own skills in building software, I was forced to rely on the model's "intelligence" to get results, which were often unexpected or not what I intended at all.
Feb 25 · 3 min read


Stop chasing AI drift. It’s time to shift from "human in the loop" to "human in control."
Takeaways: Most teams don't have a code quality problem — they have a drift problem. AI accelerates drift because it has no access to a system's history, reasoning, or intent. The answer isn't better inspection after drift happens — it's governance that prevents it structurally. "Human in the loop" means chasing drift; "human in control" means expressing intent through constraints that keep AI within bounds. When we create a document and print it, we expect that what comes out…
Feb 3 · 4 min read


We're building fast with AI in the wrong direction: Why AI development without users fails at scale
In the rush to build faster with AI, there's a hard-learned lesson being overlooked about why we build carefully in the first place. It's not about process or methodology — it's a fundamental question that speed doesn't answer: Are we building the right thing at all?
Feb 3 · 4 min read
