Monday morning, and your new colleague is clocking in. They’re keen, obedient, incredibly fast – and they don’t need a lunch break.
It’s 2025 and AI tools have transformed the workplace. According to McKinsey’s latest Global Survey, 72% of organisations have adopted AI in at least one business function and 65% are using generative AI regularly. And that’s just the AI they know about. Last summer, Deloitte reported that almost a third (31%) of UK employees who use GenAI for work use publicly available tools they personally pay for, and almost half (48%) use free publicly available GenAI tools.
The applications of AI at work are vast. From drafting emails and assisting with research to transcribing calls and developing business strategy, the opportunities are almost endless. There’s even growing talk about the ‘death of the junior developer’, as Large Language Models (LLMs) take on coding duties.
But these productivity gains can carry a hidden price. Beyond fears of job replacement, the growing use of artificial intelligence risks eroding our own skills. A new study published by researchers at Microsoft and Carnegie Mellon found that relying on AI can inhibit critical engagement with work, and that GenAI use has the potential to diminish independent problem-solving. Without a considered approach to AI tools, we risk sacrificing mental acuity in our pursuit of innovation.
With use only set to increase, CTOs and tech leaders have an obligation to ensure their teams’ skills stay relevant and valuable in the AI era. This is especially important as we train up the next generation of developers. Gen Z apprentices and graduates coming into the tech industry are AI natives and their affinity with the tools of tomorrow makes them incredibly valuable. But we need to ensure this ability is combined with the right training so that crucial skills aren’t lost.
To this end, in my team we follow the rule “learn to do it yourself first, before allowing AI to step in”. You should never use AI to write code you couldn’t write yourself. This rule is about more than encouraging teams to expand their knowledge. If you don’t understand the code, you can’t check it. And that’s critical when AI still regularly makes mistakes.
GenAI messes up simple coding tasks all the time. But the real danger is when it makes a confidently incorrect recommendation. A classic example is when it presents two ways to do something, explains both, and then recommends the less efficient one. A junior developer might take its explanation at face value, while an experienced developer has the intuition to challenge it, despite its confident tone.
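To make that scenario concrete, here is a hypothetical sketch (not drawn from any real model output) of the kind of choice an assistant might present: two ways to find duplicate order IDs, where the quadratic version can be explained just as confidently as the linear one, and only a developer who understands the trade-off will push back.

# Hypothetical example: two ways to find duplicate order IDs.
# An assistant might explain both convincingly and still recommend
# the slower one; spotting that requires understanding the trade-off.

def find_duplicates_quadratic(order_ids):
    """O(n^2): re-scans the rest of the list for every element."""
    duplicates = []
    for i, current in enumerate(order_ids):
        if current in order_ids[i + 1:] and current not in duplicates:
            duplicates.append(current)
    return duplicates

def find_duplicates_linear(order_ids):
    """O(n): tracks previously seen IDs in a set."""
    seen, duplicates = set(), set()
    for current in order_ids:
        if current in seen:
            duplicates.add(current)
        seen.add(current)
    return sorted(duplicates)

if __name__ == "__main__":
    ids = [101, 202, 303, 202, 404, 101]
    print(find_duplicates_quadratic(ids))  # [101, 202]
    print(find_duplicates_linear(ids))     # [101, 202]

Both functions return the same answer on small inputs, which is exactly why a plausible-sounding recommendation of the slower one can slip past someone who has never had to reason about scale themselves.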
AI can be of incredible assistance in coding: cutting the time needed to document code functionality by as much as 45-50% and reducing completion time for writing code by 35-45%. But it should be used to support, not replace, human work. Developers still bring immense value to the table through their creativity and critical thinking; allowing this to fade is something neither individuals nor the industry can afford.
Skill fade can be prevented in a number of ways. Tech leaders who want to protect their teams from skills obsolescence should find ways to develop employees’ critical thinking and creativity, such as running ‘hack days’ that encourage individuals to step outside the daily routine and build something new. Additionally, they need to establish a culture of transparency around AI in the workplace. Ground rules are needed to ensure that the technology is mastered, not relied upon, and education can help employees understand AI’s pitfalls as well as the value it brings. Finally, team leaders should push engineers to seek out new ways to stay relevant in an LLM-led coding reality, giving them the time, opportunity, and encouragement to pursue professional development and learn the skills needed for the roles of the future.
Among the fears swirling around AI and job loss, the issue of skill loss has largely escaped attention. But it is something to be taken seriously. Over-reliance on AI tools risks robbing workers of critical skills that are crucial to their own development as well as that of their organisation. Tech leaders and workers must be proactive in protecting against skill loss so we don’t allow AI to overwrite human abilities.
Paul Maker is Aiimi’s Chief Technology Officer, heading up the Research & Development team, with a particular focus on AI and the capabilities of new and emerging technologies.