Why leaders need to build resilience to avoid AI burnout
Usually, talk of stress in the office focuses on employees being overloaded and overwhelmed – even more so as AI adoption accelerates. The wellbeing of those in leadership positions tends to go under the radar.
It’s a quiet crisis that is becoming more prevalent. According to DDI’s Global Leadership Forecast 2025, 71% of 10,796 leaders have reported their stress levels rising since taking up their current role. This is an uptick from 63% in 2022, the consulting firm’s research found.
The leaders surveyed cited attracting and retaining talent as the main reason for their stress (54%). Successfully leveraging AI (29%), ensuring their workforce remains engaged (28%) and digital transformation (25%) were other major challenges keeping them up at night.
Stress levels across IT and security teams were high enough before AI adoption accelerated, but now tech leaders are under sustained pressure, says David Bennett, CEO of backup storage specialist Object First.
AI has increased the surface area of cyber risk – and the fear of being targeted, and of being blamed for an attack, has led to an always-on operating model. On top of this, leaders are finding themselves squeezed for time. The DDI data shows that only 30% of respondents felt they had sufficient time to perform their role to a satisfactory level. This combination is leading to exhaustion and impacting performance.
Building resilience into leadership
When leaders underperform or show signs that they’re struggling, this can have a knock-on effect on employees. Only 29% of those surveyed by DDI trust their managers, a sharp decline from 46% in 2022. If employees don’t trust their leaders, then they’re less likely to place trust in the AI systems that leaders have introduced into tech stacks. The onus is on leaders to be more resilient.
Janthana Kaenprakhamroy, CEO and founder of insurance technology firm Tapoly, argues that, to build resilience, leaders need to start by reframing their role.
AI has moved from experimental sandboxes to a key driver of growth that is subject to regulatory scrutiny and board-level accountability, says Kaenprakhamroy. Getting AI right, or wrong, can have an impact on the top and bottom lines, so leaders feel obligated to get it right from the start. This is where exhaustion begins to creep in.
As Kaenprakhamroy puts it: “burnout often happens when leaders feel they must be both a visionary and hands-on operator at all times”. In reality, “leaders don’t need to personally solve every AI problem. Their job is to set out clear strategic intent, define guardrails and empower teams to execute”.
By reframing their role this way, leaders are less likely to be short on time and can put more effort and energy into self-development. More specifically, they can hone their critical thinking and emotional intelligence skills, which are vital for making clear-headed decisions and operating successfully in high-pressure environments, such as during the initial rollout of AI.
Bennett adds that “leadership resilience starts with recognizing that this pressure [from AI] is structural, not personal, and that unmanaged stress creates operational risk for their organization”.
Leaders also need to remember not to roll out AI too quickly. The technology isn’t going to have an immediate impact on the top and bottom lines.
“AI transformation is not a single delivery milestone – it’s an ongoing capability shift,” says Kaenprakhamroy. “Leaders who treat AI as a continuous programme with phased goals, clear success metrics and realistic timelines are far less likely to burn out than those trying to force rapid, all-at-once change.”
Leadership resilience strengthens cybersecurity
Leaders who focus on building their own resilience send a signal to employees that their wellbeing matters just as much, especially those in IT and security teams.
“High stress, alert fatigue and the pressure of managing AI-driven systems are contributing to [employee] burnout,” says Bennett. “Protecting people in these roles is inseparable from the responsibility of protecting the business.”
AI systems are designed to reduce the need for constant human oversight, but leaders must still set clear expectations around accountability and remove blame from incident response. This can help to strengthen employee resilience.
Greg Hanson, group vice president and head of EMEA North at Informatica from Salesforce, adds that leaders need to put up a “strong ‘human firewall’”. This means investing in employees so that they have the knowledge to question AI outputs and the awareness of how data issues impact outcomes. They should also be able to recognize data quality and flag potential issues without needing to refer them to technical specialists.
According to Informatica’s annual CDO Insights study, released at the end of January, the majority of the 600 data leaders surveyed globally are concerned about their workforce’s AI and data knowledge. Three-quarters believe their employees need upskilling in data literacy (75%) and AI literacy (74%) in order to use AI responsibly and generate responsible AI outputs.
“Those leaders who invest in workforce literacy and give their people the ability to question and understand AI outcomes will be far better placed to scale it safely and responsibly,” Hanson stresses.
Employees who display confidence in AI can give leaders peace of mind. Those leaders who feel under less pressure are more likely to boost morale and engagement across their workforce. This in turn can bolster employees’ confidence in their own work and use of AI on a daily basis. It’s a virtuous circle that makes leaders and AI systems resilient – and should prevent AI burnout in the long run.