John Loeber has an essay worth reading: "Bring Back Idiomatic Design." He's advocating for a return to idiom - "interactions with applications should make sense, gosh darn it, now get off my lawn" - a return to interfaces built from common elements, because common elements are what help humans use those applications. He's not wrong.
The desktop software era had homogeneous interfaces - consistent menus, consistent keyboard shortcuts, consistent visual affordances - and the browser era broke all of that (and it's gotten worse as we've added platforms like mobile and augmented reality). Two shifts stand out for him: the transition to mobile, which required reinventing every interaction pattern for touchscreens, and HTML being stretched far past its original document model, so that modern web applications have no structural obligation to honor the idioms HTML would have enforced. His prescription: follow existing idioms wherever possible, prefer words over icons, and make interactive elements obviously what they are.
He's right about the diagnosis, and probably about the fix: humans like consistency. We don't want to learn how to turn on the lights in a completely different way for every room on every day. The framing worth adding is why the idioms existed in the first place - and why their loss was structurally inevitable once the ground changed.
IBM's Common User Access standard, introduced in 1987 as part of Systems Application Architecture, didn't invent UI conventions out of thin air. It looked at what was already working across applications, drew a line in the sand, and said: this is the contract. File -> Save. Edit -> Undo. ALT+F4 to close. Most applications in the ecosystem conformed, and everyone won - developers got a playbook, users got transferable expertise. Learning one well-designed application made you better at all of them. And the standard itself became a flywheel: improvements could be rolled out through the standard, and everyone inherited a base level of consistency.
CUA worked because it was enforced. The OS was the coordination layer. Applications that ignored the standard were out of compliance with the platform, and had to justify their aberrations by being extraordinarily useful, or they had to figure out how to comply. And crucially, the hardware was fixed: a keyboard with a Control key, a mouse with buttons, a display with known dimensions. The contract was written for a specific physical context, and that context was stable.
We had a line in the sand... and then the sand changed.
^C and ^X don't work when there's no Control key. This sounds obvious stated plainly, but it's the entire problem. The idioms of the desktop era were inseparable from the hardware they ran on. CUA didn't just assume a keyboard - it assumed that keyboard, with those modifier keys, generally in that physical relationship to the user's hands. Move to a touchscreen and the contract is void. Not just broken, void. You're not in the same environment anymore.
This is why the Commodore 64 Ultimate reissue has a reasonable shot with its target market: it offers something free emulators, however excellent, can't fully replicate. The emulators are technically faithful, but the Commodore 64 keyboard was part of the interface - the specific feel of those keys, the layout, the function keys running down the right side rather than across the top. Muscle memory built against that hardware doesn't transfer to a USB keyboard running VICE, no matter how accurate the software is. The idioms were embedded in the physical artifact. You can emulate the software; you can't emulate the hands that learned it.
The same dynamic at scale explains the current mess. Loeber is right that mobile forced a reinvention, but it's more than that: every significant shift in deployment environment potentially voids the existing contract. Desktop, browser, touchscreen, voice, AR - each is a different physical context with different affordances, and idioms that crystallized in one don't automatically survive the move to another. We're not in a period of design laziness; we're in a period of genuine divergence because the environments themselves are divergent.
There's a third factor Loeber doesn't address: AI-generated interfaces are making this ... weird.
Models trained on the web learn from the web as it is - which means they've ingested the full heterogeneous chaos of a decade of divergent design. When you ask an AI tool to generate a UI, you're asking it to synthesize from a training set that includes every contradictory pattern that ever shipped. The result tends toward the generic and the recombinant: interfaces that look like interfaces, that feel vaguely familiar without being idiomatically consistent with anything in particular. Worse, the models have absorbed all the bad patterns alongside the good ones, with no strong signal distinguishing them.
The risk isn't that AI generates ugly interfaces. It's that AI generates confidently average interfaces - interfaces that satisfy the prompt without advancing toward any new coordination. They don't inherit CUA's discipline. They don't predictably experiment toward new idioms. They recombine what exists. In an era when we need either genuine adherence to existing idioms or genuine experimentation toward new ones, "confidently average" is the worst outcome.
Here's the thing about CUA itself, though: it wasn't handed down. It came after a prior period of chaos - incompatible systems, inconsistent navigation, applications that each invented their own interface from scratch. IBM looked at the mess, identified the patterns that were already winning, and formalized them. The coordination layer came after the experimentation, not before.
Which means the current chaos is not the disease - it's the process.
Pull-to-refresh didn't exist before Loren Brichter put it in Tweetie. It felt wrong for about ten seconds, then it was everywhere, then it was invisible: a new idiom, bootstrapped from nothing, calibrated to the specific physics of a touchscreen and a thumb. Someone had to cover new ground to discover that the right gesture in the new environment wasn't a downward click but a downward drag. The chaos was the precondition.
Games have often served this function, whether intentionally or not. A joystick and a number pad can drive an interface that no word processor would dare ship. The stakes are different - a confusing game control scheme costs you players, not payroll, and a game can pivot without the same impact a new CRM version would have - and that lower-stakes environment is exactly where novel idioms can get stress-tested before the real world inherits them. When a game mechanic becomes a gesture and then a standard, that's the pipeline working.
The productive question isn't "how do we restore CUA" - you can't, because the hardware isn't the same. It's: if you have to put two things together without the tools we used to use, what tools work now? Which ones are worth formalizing? What's already working well enough that someone should draw a line in the sand and say, "this is the contract"?
We're improvising daily, in a lot of ways: navigating new modes of interaction, and doing it without a centralized discipline - the people best positioned to experiment aren't the people best positioned to evaluate, and vice versa.