There’s a conversation that happens a lot in leadership circles, just never out loud.
It goes something like this: “I should probably engage more with this AI stuff. But what if I ask a stupid question? What if everyone else already knows this and I’m the only one who doesn’t?”
So you stay quiet. You observe. You wait for the moment when you feel ready enough, informed enough, confident enough to participate.
I know this because I spent six months in the U.S. waiting for exactly that moment, and it never came.
What I Thought I’d Find
Over the past six months, I had the privilege of spending time in Massachusetts and Minnesota, and attending the AI Summit in New York. I went in with a clear expectation: that somewhere in this ecosystem, which in my view is the most advanced AI environment in the world, there was a level of clarity I was missing back home. A sort of playbook. A hidden layer where the people who really understand this would finally explain how it all fits together.
I had a bit of a Wizard of Oz moment in mind. I was going to peek behind the curtain and finally see who’s pulling the levers.
What I Actually Found
That didn’t happen.
Yes, the U.S. is ahead in many ways. The investment is real. The infrastructure is deeper. Companies can move faster, test more, and absorb mistakes in ways that smaller markets can’t.
But being ahead doesn’t mean having it figured out.
What stood out was how much is still being worked through in real time. Systems don’t scale the way they were expected to. Vendors promise more than they consistently deliver. Teams explore use cases that sound brilliant in a room and quietly fall apart when they meet real constraints.
And perhaps what was most personally relevant: the people doing the most meaningful work in this space were not the ones with the most polished answers.
They were the ones most comfortable saying: “I don’t know. But I’m trying this.”
They were curious in a very active way. Not passive interest, but real engagement. They would try tools, break them, rebuild things, change their minds, and do it again the next day. They talked about this constantly, to the point where the people around them would eventually ask them to please, just once, talk about literally anything else. (That, I’d argue, has happened to many of us already. A decent benchmark for genuine engagement :p)
What I realised is that what we call “expertise” here is mostly just exposure.
Not a hidden framework. Not a secret layer. Just time spent interacting with the problem, from different angles, over and over again.
The Conversation That Stayed With Me
I had many great conversations during this time. But one of the most valuable moments came in a small setting, sitting down with someone who works closely with one of the leading consulting groups in U.S. healthcare. The topic drifted to small language models running directly on enterprise devices: phones, laptops. Bringing AI closer to where decisions actually happen.
The argument was compelling: reduce latency, reduce dependency on central systems, create something more embedded in day-to-day work.
I found myself pushing back, not on the idea, but on what sits underneath it.
Because from an architecture perspective, especially in regulated environments, that shift isn’t just technical. It changes where control lives. When the model is no longer centralised, you lose visibility. You can’t observe interactions the same way. You can’t enforce policies as consistently. A prompt no longer needs to come through a clean API; it can come from an email, a file, a copied piece of text, and find its way back into enterprise systems in ways that are very hard to detect.
What I brought to that conversation wasn’t a better model or a better tool.
It was a different constraint: in some environments, especially banking, you don’t start with capability. You start with control. And whatever you build has to respect that, even if it slows things down.
That perspective had value. Not because I knew more than the person across the table; that was clearly not the case, because this person is brilliant. But because I came from a different environment, with different pressures, and I was willing to say what I saw.
That’s the thing about real conversations: you don’t need to be the smartest person in the room to add something worth hearing.
What This Means If You’re Holding Back
I used to think of places like New Zealand as being behind in AI. Less aggressive investment, fewer large-scale experiments, a stronger tendency to wait for things to stabilise before committing.
Now I see both sides.
That caution reduces risk. It forces clarity. It avoids wasted effort. The trade-off is speed of learning, because in this field, learning doesn’t come from reading about it or waiting for best practices to settle. It comes from doing. From trying, failing, adjusting, and trying again.
And here’s what that means for you:
If you’ve been waiting until you know enough to engage (I know I was), you’re using the wrong threshold.
The leaders I met who were creating the most value were not the most technically sophisticated. They were the ones willing to show up to the conversation before they had all the answers. They asked questions that others were too embarrassed to ask. They pushed back when something didn’t feel right, even without the vocabulary to explain exactly why. They brought their context, their industry, their constraints, their experience and trusted that it was enough to contribute.
And let me tell you, it was enough. Every time.
The Permission You’ve Been Waiting For
There is no moment where everything suddenly becomes clear.
There is no hidden layer where the real answers live.
There are just people who decided to engage with the problem instead of waiting for it to resolve itself.
You don’t need perfect clarity. You don’t need the full picture. You don’t need to be the most technical person in the room.
You need to bring what you already have (your judgment, your experience, your willingness to ask honest questions) and step into the conversation.
Because the field isn’t looking for more experts.
It’s looking for more people who are genuinely curious and honest about what they don’t know.
That might already be you.
The only question is whether you’re willing to act like it.