Automation Isn’t the Problem. Abandonment Is
Automation is not the threat people often make it out to be. The real risk lies in how poorly society supports people through periods of change.
Author’s Note
I was sitting in the pub with a group of mostly Yorkshire folk after a friendly kayaking session at the pool. I usually avoid political conversations, partly out of fear that my mouth will not keep up with my thought process, and partly because I know my limited vocabulary can make things come out wrong. I don't want to be misconstrued, but I often am.
It's impossible to ignore the context here. Many towns around us have suffered for decades from the decline of the mining industry. Change has been lived, and abandonment has been felt.
A good friend of mine was complaining about the self-service checkouts in shops, his complete frustration with them, and how they are replacing people's jobs. To him, they represent jobs being taken away and communities losing more of what little security and socialising they have left. I felt frustrated in response, because in my mind we always want to improve productivity for both individuals and society, to free up time for more meaningful relationships, self-improvement, and leisure. Throughout millennia, society and jobs have changed, and we have adapted.
What I failed to communicate was that these two things are not in opposition.
Debates about automation tend to collapse into false binaries. You are either "pro-progress" and indifferent to job losses, or "pro-worker" and suspicious of technology. This framing is wrong historically, economically, and morally.
I struggled to express this properly in the moment, and I probably sounded heartless as a result. That sat with me afterwards. Not because I changed my mind, but because I realised I had not said what I meant, or said it in a way that respected the weight of his experience.
Where the Problem Actually Lies
Societies have always evolved their work. The question has never been whether jobs change, but how people are supported while they do.
Self-service checkouts are a useful example. They are often treated as a symbol of dehumanisation. Machines replacing people, profits rising while communities hollow out. But the real issue is not that machines exist. It is that the gains from automation are captured by a narrow group, while the risks are pushed onto workers and consumers alike.
Automation increases productivity. This is not controversial. Higher productivity means fewer labour hours are required to achieve the same outcome. Historically, this has been a good thing. It freed people from agricultural drudgery, enabled mass education, and underpinned modern living standards.
Blocking productivity growth does not protect the vulnerable. It raises prices, reduces competitiveness, and ultimately entrenches inequality. Progress delayed is not progress denied. It is progress made more painful later.
The problem emerges in the transition.
Technology moves quickly. Institutions move slowly. People live in the gap between them.
When a checkout becomes automated, the cashier does not instantly become a software engineer. Skills are sticky. Identity is sticky. Retraining is risky, especially for those with dependents, health constraints, or limited time. Resistance to automation is therefore not irrational. It is often a rational response to being asked to absorb systemic risk alone.
This pattern is not new. We have seen it repeatedly, from agriculture to industry, from industry to administrative work, from administrative work to digital systems. Each shift eventually stabilised. New jobs emerged. Productivity rose. Living standards improved.
But each transition also produced a lag, and it is during that lag that harm concentrates.
Modern societies often behave as if this lag is an unfortunate but unavoidable side effect. It is not. It is the result of choices.
Education systems are still built around the assumption that people choose a career early and remain in it for decades. Training is front-loaded, inflexible, and increasingly detached from economic reality. Mid-life retraining is treated as exceptional rather than normal. Failure to adapt is moralised rather than mitigated.
This is backwards.
In an economy defined by continuous technological change, adaptability is infrastructure. Training should not be a personal gamble. It should be a shared responsibility. Education should prioritise learning how to learn, digital literacy, and systems thinking, not narrow role preparation for jobs that may not exist in ten years.
Concerns about automation are often framed as opposition to progress. More often, they are objections to asymmetric benefit. When automation reduces labour costs, the surplus rarely flows to workers or customers. It accumulates elsewhere.
Self-service checkouts make this visible. Labour is not eliminated. It is reassigned. Customers scan, bag, troubleshoot, and absorb delays. Staff numbers fall, but prices rarely do. Efficiency increases for the firm, not necessarily for society.
This is not a failure of technology. It is a failure of distribution.
Change Requires Participation
None of this removes personal responsibility.
Structural support matters, but it does not replace individual agency. People are not powerless in the face of change, and treating them as such creates its own problems.
The world has always changed, and it will continue to do so. When those changes are visible, there is also a responsibility to respond to them where possible. Waiting to be carried forward by institutions alone is rarely enough. Motivation, curiosity, and a willingness to adapt still matter.
This is not about blaming people who struggle. Many do not have the time, health, money, or stability needed to pivot easily. But where people do have capacity, ownership matters. Taking responsibility for learning, reassessing assumptions, and responding to change is part of living in a society that does not stand still.
Personal agency and social support are not opposites. They reinforce each other. Support lowers the cost of adapting. Ownership determines whether that opportunity is taken.
A society that demands adaptation without support is cruel. A society that offers support while denying agency is infantilising. Both fail in different ways.
Progress works best when people are given the tools to change, and are also expected to use them.
Trying to preserve specific jobs simply because they exist is the wrong goal. Justice is not about protecting tasks. It is about protecting people.
If society can meet its needs with less human labour, the ethical response is not to invent inefficiencies. It is to reduce the cost of change and broaden the benefits of progress.
Automation is not the threat. Abandonment is.
Practical Implications
This is not a call for a single ideology or a grand redesign. It is a call for alignment between how fast technology moves and how well people are supported.
Some principles follow naturally.
Training as a right, not a risk
Retraining should be accessible, funded, and normalised across adult life. It should not be something people gamble their income, identity, or stability on. If society expects people to adapt to change, it has a responsibility to lower the personal cost of doing so. At the same time, where people are able to engage with retraining, there is an expectation that they take that opportunity seriously rather than wait for change to bypass them.
Education for adaptability
Education systems should prioritise transferable skills. Learning how to learn, digital literacy, and comfort with change matter more than preparation for specific roles. This applies as much to adults as it does to young people. Adaptability works best when it is treated as a lifelong process rather than a one-time phase at the start of a career. Change is also easier to respond to when it is signposted honestly rather than arriving as if sudden and unannounced.
Shared gains from productivity
When automation reduces labour requirements, the benefits should appear somewhere tangible. Shorter working hours, better pay, lower prices, or stronger public services. When those gains accumulate primarily as private wealth, frustration grows and trust erodes. Companies that benefit most from automation rely on public infrastructure, education systems, and social stability, and it is reasonable that they contribute fairly to sustaining them. Responsibility should scale with influence. Individuals, institutions, and companies do not carry the same weight, and should not be treated as if they do.
Transition support, not moral judgement
People who struggle to adapt are not failures. They are navigating a system that externalises risk. Support should be designed accordingly. But support should not remove agency entirely. Reducing the cost of change makes participation possible. It does not remove the need for people to engage with change where they have the capacity to do so.
Technology as a tool, not a shield
Automation should improve outcomes for people. It should not act as cover for cost-cutting that degrades work, service quality, or accessibility. Tools should be judged by the results they produce, not just by the efficiencies they promise.
None of this requires rejecting progress. It requires taking responsibility for it.
We have navigated economic transformations before. We will again.
The difference between success and harm lies not in whether we change, but in whether we leave people behind while we do.
Note on Process
I use AI tools as part of my writing process, including for this post. They help me iterate on ideas, test phrasing, and turn rough or fragmented thoughts into something more coherent. I do not treat them as authors or sources of opinion, but as tools for refinement and clarity.
As a neurodivergent writer, I often know what I think long before I can explain it cleanly. Writing can be slow and frustrating when trying to connect complex ideas without losing nuance. These tools help me bridge that gap. The arguments, positions, and conclusions here are mine, even when the wording has been shaped through iteration.