Employer awareness of whether AI is likely to have a positive or negative impact on individual employees’ mental health and wellbeing may not be particularly high at the moment. But it is vital that action is taken now to prevent an already problematic workplace mental health situation from getting worse as AI continues to take hold.
One growing challenge in this context, points out Dr Shonna Waters, Co-Founder and Chief Executive of human capital performance advisory and consulting firm Fractional Insights, is “behavioral spillover”. She sees three interconnected dynamics at work here, each having an impact on human relationships – the most important predictor of happiness and life satisfaction, according to a long-running Harvard study.
The first dynamic is that chat-based AI is training employees to expect:
Immediate compliance without negotiation
No emotional resistance or pushback
Perfect responsiveness without relational maintenance
Output without the “mess” of human feelings.
As a result, Waters says:
When humans don’t perform like software because they need context, push back, have bad days, or require emotional attunement, we’re increasingly experiencing that as friction rather than normal human interaction.
Opting out of emotional complexity
This scenario correlates with the second dynamic, she says:
We’re losing patience for the things that make humans human: ambiguity, emotional complexity, the need for relational context. A senior technology executive recently told me: ‘What I love about AI is that you remove culture so there’ll be less drama.’ Recent data from Upwork shows that 64% of top AI performers say they have a better relationship with AI than with human co-workers. They find AI more polite, more trustworthy, and more agreeable. This isn’t about efficiency. It’s about emotional avoidance. We’re not just learning to command AI, we’re learning to prefer interactions that don’t require emotional labor.
The third dynamic is that too many employers are designing systems based on how they would like employees to behave rather than the reality of what they do. This situation shows up in three ways, Waters believes:
Reduced context-giving: We expect others to ‘just know’ what we need
Intolerance for emotional processing time: We want immediate responses without putting in the relational work
Treating pushback as malfunction: When humans disagree or need clarification, we experience it as system failure rather than valuable input.
The upshot of all of this, says Waters, is that:
We’re not just choosing AI for efficiency. We’re choosing it because it lets us opt out of emotional complexity. The question isn’t whether AI can replace human tasks. It’s whether repeated interaction with command-response systems is eroding our capacity for the ambiguity, negotiation, and emotional attunement that human collaboration requires.
Supporting relationship development
So, what can employers do to ensure their workplaces are optimized to support emotionally healthy humans who can thrive in relationship with others, even as the automation levels around them steadily increase?
The first consideration is to embed social cohesion and belonging into the way people work. This is about promoting a shared identity, community ritual, and inclusion practices that bring teams together. Doing so is vital, according to a recent Harvard Business Review report entitled ‘Loneliness is Re-shaping Your Workplace’, because:
It creates psychological safety, reduces friction, and generates a sense of collective purpose and energy. Social cohesion isn’t about uniformity but about ensuring people feel like they’re seen and their contributions are valued.
The second factor here is providing employees with opportunities to build meaningful interpersonal relationships. These are trust-based connections that go beyond people simply undertaking work tasks together. Such connections are important, the report indicates, because:
When team members genuinely know and trust each other, they communicate better, solve problems faster, and are more likely to innovate together, thus making deep relationships key in combating loneliness [a key contributor to mental health problems] and increasing productivity.
The importance of leadership
But Waters also believes leaders have an important role to play in creating a workplace culture where employees feel psychologically safe by becoming “architects of trust”:
Our mental health and wellbeing are a byproduct of scanning the environment to understand if it’s safe, and building trust is one of the things leaders can do to positively impact that.
There are three levers here, she argues:
Credibility: People are pretty good at sniffing out if something’s not right, so it’s best to admit if you don’t know something;
Integrity and the say-do gap: It only makes the situation worse if you try to communicate messages that sound good, but you behave in a way that isn’t consistent;
Benevolence: There’s so much rhetoric with AI around efficiency and productivity that can leave employees feeling like ‘tools’. It’s very transactional language. People know it’s part of the game, but it needs to be balanced with messaging and metrics that are about designing for wellbeing and the employee experience. It matters that people think you have their best interests at heart.
Wende Smith, Head of People Operations at HR platform vendor BambooHR, agrees:
Leaders have to approach this chaotic environment with a strategy and planning, so they feel more in control rather than simply having to react. It’s continual change that’s causing stress, so it’s important to temper that with clarity over how AI will help us achieve our business goals and how it will help employees be better at their jobs.
Taking the ambiguity out
One approach Smith has taken is to develop a clear policy so that people know exactly when and how they should be using AI or not. Chief executive Brad Rencher has personally endorsed the technology’s usage within the business and highlights successful use cases at each monthly company update meeting.
Dedicated AI champions have also been appointed for each department. They are fed best practice by an AI Governance Committee and meet weekly to discuss what is working and what isn’t.
This information is then provided to divisional leaders, who spend time at each quarterly Leadership Summit discussing how effectively AI is operating in their department. The aim is to build an understanding of how the technology can best be capitalized on across the business. As Smith says:
It’s about giving people permission and ensuring they feel supported, that we understand mistakes happen and that we need to learn from them…It goes for pretty much everything, but if you’re not clear and open, it creates stress, even if you don’t have all the answers. So, communication is the number one thing that you can’t do enough of. People are different in how they adapt to change and AI is a big disrupter. It all goes back to having a strategy around it and communicating that, so everyone understands where they fit.
Waters agrees:
Training is important, but not more important than clarity. Organizations are spending millions on skills development, but 70% of employees say the biggest barrier to using AI confidently is understanding what the rules are. So, start with clarity and move onto training. Provide clear guidelines on acceptable use and embed the technology into workflows so the ambiguity is taken out.
My take
The impact AI is having on employees’ mental health is much more complex than a simple fear of job losses. This means employers need to take a far more sophisticated approach to wellbeing than has previously been the case with other tech deployments. If they fail to do so, the consequences could well be dire.