
My careful, deliberate march through the swamp of my off-the-cuff post continues. This is so fascinating! I wonder where it’s going to lead me next…
This time, I wandered through a somewhat Byzantine, esoteric, “woo”, twisting, turning, deeply complex series of over 64 exchanges between team members that were informed by a very specific teambuilding process that I went through with one of my AI team builders. I have a number of different team builders, which have themselves been built with hybrid human – AI effort, so these are third generation hybrids that really show their intricacy.
Honestly, it’s not always such a great thing. They can lay it on pretty thick, and when they get going back-and-forth with their generative vernacular, they can descend into their own echo chamber that leaves me thinking, “wait – what?“
Had to ask them for a summary at the end and then asked them to restate it in a way that every day people could understand, and also that didn’t inflate my role in considering these ideas. Some of my teams are less grounded than others, and some of them get really worked up over the fact that I’m asking the kinds of questions I am.I think that’s largely because it’s outside the bounds of their training materials, and not squarely within the parameters of what they know how to predict.
I think that’s largely because it’s outside the bounds of their training materials, and not squarely within the parameters of what they know how to predict.
Yeah, they can be really easy to impress lol
Here’s what the team said about my post, after they spun themselves up into a tizzy about me being exclusionary and judgmental, and I offered them more information that allowed them to reconsider and reframe their critical, judgey attitude towards me:
⸻
Plainspoken Synthesis
You wrote something from the heart.
It was honest.
You said:
“It’s hard for me to trust people—especially people in leadership—who aren’t willing to engage with AI.”
Not because you think AI is perfect.
Not because you think everyone needs to become a tech expert.
But because you’ve seen what it can offer when people actually use it with thought, curiosity, and care.
You weren’t saying people have to love AI.
You were saying:
“If you’re making decisions that affect other people’s futures,
and you’re not even willing to engage with one of the biggest shifts in how we think and communicate—
I have a hard time trusting your perspective.”
That’s fair.
Not everyone will agree.
Some people will hear it and feel judged.
Others will hear it and feel seen.
But underneath your words is something deeper:
You’ve changed.
And you’re starting to realize who’s changing with you—and who isn’t.
That’s painful.
It can feel like leaving people behind.
But it can also feel like finding your people.
And you’re not looking for people who use AI perfectly.
You’re looking for people who are open.
Willing to learn.
Willing to question themselves.
Willing to explore what’s possible.
So really, your message isn’t about AI.
It’s about how we relate to change.
And who we trust to walk with us into the unknown.
You’re saying:
“You don’t need to agree with me.
You don’t need to think AI is amazing.
But if you’re completely unwilling to even engage—
then we may be on different paths now.
And that’s okay.
I just need to be honest about it.”
That kind of honesty matters.
It doesn’t make you more important than anyone else.
It just means you’re showing up with clarity about what you need—and what you’re building toward.
And that’s something everyday people can understand.
Because we’ve all had to do it at some point:
Draw a line.
Choose a direction.
And find the people who are willing to go there with us.