
The Real Barrier to AI Adoption Isn’t Technical. It’s Trust

Only 34 per cent of Australians believe that AI will improve their lives.
I would like to explore why this might be the case.
Over the past year, one theme has surfaced in every conversation I’ve had with business leaders about AI adoption: it’s not about whether the technology works, or whether people understand it.
It’s about trust.
We’ve had tremendous advancements in AI over the last few years, with an incredible array of tools now available for mainstream, daily use. However, Australia lags behind many other countries in its use, advancement, and acceptance of AI.
Uptake of AI has been cautious, to say the least, with many leaders openly sceptical about adopting it.
Why? Because in Australia, AI doesn’t yet feel trusted.
Recent data supports this sentiment.
While half of Australians used AI regularly, only 36 per cent were willing to trust it, and 78 per cent were concerned about negative outcomes. At the same time, we have among the lowest levels of AI training and education, with just 24 per cent having undertaken some kind of AI-related learning, compared with 39 per cent globally.
I don’t think this is a coincidence.
This hesitancy is reflected in behaviour. Even when tools like Microsoft Copilot or ChatGPT are introduced in the workplace, they’re often underutilised. Not because they lack functionality, but because people are unsure about what using them signals.
In a culture where fairness, effort, and transparency are deeply valued, the perception that AI is "cutting corners" or "cheating" can be hard to shake.
The 2024 Edelman Trust Barometer focused on “Innovation in Peril”, and it showed that Australians trust scientists and academics, but express growing concern over how companies use emerging technologies. In fact, when it comes to AI, only 15 per cent of people surveyed embraced this innovation.
Without clear guidance and strong communication from leadership, AI can easily become the scapegoat for broader fears about job loss, surveillance, or ethical erosion.
There are real reasons behind these concerns. In the past year, we’ve seen well-publicised examples of AI-linked job displacement.
Canva, for instance, made its first-ever round of redundancies in early 2024, letting go of most of its technical writing team. Those same employees had been openly encouraged to use AI to support their work, only to find themselves made redundant once it became clear fewer people were needed.
Duolingo and Shopify also made headlines when they declared themselves “AI-first companies,” prompting public backlash over fears that opaque algorithms were replacing human roles.
But here’s the nuance that’s often missed: AI didn’t replace these employees overnight.
Leaders made decisions about how to reorganise work based on productivity gains. The AI itself was just a tool. This distinction is critical.
Because without it, AI becomes synonymous with disposability, not opportunity.
What’s needed now is a shift.
From a technology-first conversation to a human-centred intelligence approach. This idea goes beyond tools and platforms. It asks: how do we design AI adoption in a way that supports, rather than threatens, our people?
It starts with leadership.
Trust doesn’t happen by accident. Leaders must intentionally build the conditions for it. That means being transparent about why AI is being introduced, what it will be used for, and how it will affect roles. It means educating teams on how to use it responsibly. And crucially, it means framing AI as a tool that enhances people’s impact rather than replaces it.
Consider the introduction of the calculator. When calculators entered the workplace, they didn’t replace people; they enabled them to do faster, more accurate work. What mattered was how the tool was integrated into workflows, and how leaders helped their teams adapt.
AI is no different.
But because it touches on cognitive work, like writing, decision-making, and analysis, it feels more personal and more threatening.
This makes leadership communication even more important.