Generative Friction: the new psychology of creative work
How generative tools shift the cognitive burden from making to choosing—and what it means for the future of work
This is the first in a series exploring Generative Friction—the behavioural shifts taking place in creative work when the act of making becomes effortless, but the act of deciding does not. As generative tools proliferate, they promise speed, ease, and limitless iteration, but beneath the surface they quietly reshape our mental models of authorship, choice, and satisfaction.
This series examines how that shift plays out in practice. From decision fatigue to pseudo-agency, from anchoring effects to the cognitive cost of refinement, these are no longer edge cases of design friction—they are the new psychology of creative work.
The changing nature of friction when we go from scarcity to surplus
There was a time when the hardest part of creating was simply getting something made. Production was the constraint—whether that meant typesetting, editing film by hand, or iterating a product one prototype at a time. The friction was material, visible, and often linear. Time and labour shaped what was possible, and in doing so, narrowed what needed to be decided.
That’s changed. Generative AI tools, in particular, have chipped away at production limits. Whether drafting text, composing visuals, or generating code, these systems can produce a wide range of outputs at a scale and speed that would have been unimaginable just a few years ago. What once took hours can now be done in minutes—or prompted into existence with a few well-placed words.
But rather than freeing us from friction, this shift has introduced a subtler cognitive weight. As production becomes faster and easier, the nature of friction changes—less about making, more about managing the weight of choices. The challenge now often lies in navigating a growing field of options, deciding which direction holds value, and recognising when refinement stops being productive.
Cognitive load and the new creative fatigue
This shift from production to decision brings more than logistical consequences. It gradually reshapes how we think—changing when decisions happen, what informs them, and how much weight they carry in a landscape of proliferating outputs. Instead of moving through a process from idea to execution, we’re often flooded with outputs at the outset. Dozens of options appear before a single decision is made. Iteration happens before intent is even fully formed. And with each additional variation, the pressure to discern—rather than simply generate—grows heavier.
Sometimes this abundance is energising because it invites exploration, expands the sense of what’s possible, and lowers the cost of experimentation. But it can also lead to a loop of refining, tweaking, adjusting without end. Not because the work demands it, but because the tools allow it—and once a better version always seems within reach, finality begins to feel arbitrary.
The marginal cost of generating new outputs is near zero, but the cognitive cost of evaluating them compounds. Satisfaction diminishes. Finalisation stalls. The refusal to select is often misread as perfectionism, when it may be better understood as a defence against anticipated regret—a discomfort fuelled by the awareness that more options will always follow.
The shift in creative roles from making to filtering
While these patterns can be observed at the individual level, they also reflect broader dynamics in how systems shape cognition. When tools are designed to output at scale, they subtly reorient effort away from expression and toward evaluation. Creative roles begin to shift. Writers become selectors. Designers become filters. Strategists become curators of system-generated options.
The skillset has expanded. It now includes not just the ability to generate ideas or outputs, but the capacity to navigate abundance—to identify which possibilities carry weight, which are merely variations the system has spun out, and which only appear promising because they fit familiar patterns the model has been trained to reproduce.
The invisible frame: how defaults anchor our choices
And yet, that recognition rarely happens in isolation. System-suggested options—what's surfaced first, what aligns with past patterns, what feels familiar or frictionless—tend to anchor the process. Even when we believe we're choosing freely, our judgement is scaffolded by defaults we may not even notice.
We still generate the outputs, but the structure of the decision space is set by those defaults. They create a sense of autonomy that can obscure the subtle ways in which judgement is being guided. It's a new form of the illusion of control: a pseudo-agency in which choice feels autonomous but is constrained by the invisible architecture of generative systems.
Entangled agency and the disappearing author
This raises a broader question: what does discernment look like in a system that never stops offering alternatives? How do we define intention when the tools we use are built to multiply, not converge?
These concerns reflect more than surface-level discomfort. They suggest a gradual but consequential change in how our thinking takes shape—less through solitary reasoning, and more through interaction with tools that anticipate and shape our responses.
As the effort required to create diminishes, selection plays an increasingly prominent role—often operating alongside creation in recursive cycles. Each new iteration invites another judgement, and each act of choosing can prompt further refinement. The cognitive work shifts not from one domain to another, but into a more entangled relationship between making and discerning.
When systems shape the landscape of available choices—highlighting some paths, obscuring others—they inevitably influence not just what we choose, but how we arrive at those choices. Over time, this can alter our sense of authorship, making it harder to distinguish between conclusions we reached deliberately and those we accepted because they were placed in front of us.
AI tools do not just assist cognition; they co-constitute it. Creative work becomes entangled agency, not solitary authorship.
What comes next
Upcoming pieces in the Generative Friction series will explore this boundary in more depth—especially in the context of generative AI tools. They’ll examine how system-suggested outputs shape judgement, how iteration alters the nature of authorship, and what it means to sustain clarity when the path of least resistance is endless generation.