AI Fails When We Don’t Consider How Work Is Done

We spend a great deal of time talking about what technology can do.
Far less time defining how it fits into real workflows and real people’s lives.

This isn’t a new problem, and it isn’t unique to AI.

One of the most consequential shifts in the early internet wasn’t purely technical. It was the move from systems designed around engineering logic to experiences shaped around human understanding. Early websites and enterprise software often worked, technically speaking, but they were difficult to navigate, unintuitive, and unforgiving. To get anything done, people had to learn the machine’s logic first.

What changed wasn’t just better infrastructure; it was the introduction of design practices that asked how people actually worked, what they needed, and where friction lived. Ease of use — and the explosion of digital services that followed — came as much from better experiences as from better technology. Those experiences emerged by taking human stories seriously.

That same dynamic is playing out again now.

I’ve heard a framing recently that has stayed with me: one of the primary reasons organizations aren’t seeing meaningful return on their AI investments isn’t model performance or tooling limitations. It’s that they aren’t identifying and implementing use cases that genuinely add value to the organization.

Teams often stay close to obvious, low-risk applications: drafting emails, summarizing documents, and automating narrow tasks. These uses can save individuals time, but they rarely translate into systemic improvement. Time savings at the individual level don’t automatically compound into organizational change.

What gets missed are the larger opportunities. The ones that only surface when we step back and examine how work actually happens.

And this isn’t just about AI.

I saw this recently while working with a team on a product where AI was part of the roadmap, but not the focal point. The system itself was already going to shape how people made decisions, interacted with information, and moved through their day. AI simply raised the stakes.

The team arrived with a thoughtful, well-structured set of technical requirements. Everything was defensible, but instead of optimizing immediately toward those specs, we paused and asked a different set of questions:

  • When, where, and why will people interact with this system?

  • What does their day look like now — and how might it change?

  • Which tasks become easier, which become harder, and which disappear altogether?

  • Where does friction move, rather than vanish?

  • How do these components come together into a coherent experience?

That shift — from capabilities to lived experience — changed the nature of the conversation. It moved us away from “Can we build this?” and toward “What problem are we actually solving?” and “What kind of work are we enabling?”

This kind of inquiry matters anytime technology comes into contact with people, whether it’s AI, automation, or a new internal system. Bolting technology onto existing workflows without interrogating those workflows doesn’t produce transformation. It preserves existing constraints, and often amplifies them.

Despite how it’s sometimes framed, this work isn’t soft or abstract. Making deliberate decisions together about how work should change is how meaningful use cases surface at all. It’s how teams build shared understanding, align on priorities, and avoid solving the wrong problems very efficiently.

The irony is that we already have many of the tools required for this moment. Design research. Systems thinking. Facilitation. Scenario-building. Sense-making. These practices were essential in making earlier waves of technology usable, scalable, and valuable.

Human stories aren’t a soft add-on to technology strategy.
They’re the connective tissue that makes adoption coherent.

Without them, we risk repeating a familiar pattern: powerful systems that technically work and practically fall short.
