Anthony R Quigley
May 21, 2025 • 5 mins read

Holding Space for Our AI Uncertainty

Something is building underneath AI adoption headlines and celebratory launch announcements, something more personal and human. It’s fear, and not always the loud, dramatic kind. For some, it’s a slow, sinking unease about what this new era might mean for people who see the world through a deeply human lens, whose values, livelihoods, or sense of self are rooted in creativity, care, and meaning-making.

And for many, it’s not just skepticism about AI. It’s a kind of grief, grief for a future they don’t see a place for themselves in. This grief is founded on the belief that their contributions, which only a short time ago felt irreplaceable, might soon be replicated, automated, or simply ignored. This grief deserves space.

A Familiar Pattern in History

It’s important to remember that this isn’t a new phenomenon. We’ve seen versions of this before: when the printing press spread knowledge beyond the elite who once held it, sparking literacy and eventually reshaping power structures; when photography challenged the sacred craft of painting, leading to the rise of new art movements like Impressionism that embraced subjective experience over realism; when factories automated the labor of skilled artisans, dramatically altering economies and social classes. Each of those moments reshaped society profoundly, bringing both excitement and dread, a tangible sense that something meaningful had been gained, yet something deeply valued had also been lost.

Now we’re here again. Only this time, it’s moving faster. (Much faster). And the scale is bigger. (Much bigger). And the domains being touched—language, vision, reasoning, synthesis—are the ones we’ve long held as distinctly human. We’ve built our species’ identity within them.

As we navigate this profound shift, we’re also rewriting our digital identities

Since the advent of digital technology, we’ve all been conditioned to think of ourselves as “users,” trained to interact efficiently and passively with machines. We learned to expect clear instructions, avoid errors, and escalate problems to specialists. But the era of generative AI asks something entirely different from us. We’re shifting from passive users to active creators, designers, and collaborators who engage dynamically with these new systems. It means starting with ideas rather than simply following instructions, navigating without clear roadmaps, and accepting uncertainty as part of the process.

This isn’t just a shift in how we interact with technology; it’s a profound shift in how we see ourselves and our roles. The unease we feel isn’t merely about what the future might hold, but about the digital identities we’ve held for so long and now need to reimagine.

Creating Space for Processing

For those who see potential in what’s emerging, it’s easy to move fast. To build, to experiment, to push boundaries. But that momentum shouldn’t come at the expense of empathy. We need to create space in our teams and organizations for people to process what this means for them. Not with performative town halls or surface-level check-ins, but with real listening. And with affirmations that fear and uncertainty are okay. This moment isn’t just about technological transformation. It’s about emotional reckoning, too.

Beneath the surface of fear usually lives something else: care. Care for the craft, relationships, and ways we’ve worked, lived, and created meaning together, such as the journalist concerned about AI-generated news replacing carefully researched stories, or the therapist who wonders if automated mental health chatbots might overshadow empathy and nuanced understanding. That care is worth honoring.

And that care doesn’t mean someone is anti-AI or resistant to progress. It means they’re human, and they’re processing change.

Why We Need Their Humanity

We need these people. We need their questions, standards, and reflections. We need their humanity, not as a counterpoint but as a complement. The future we’re moving into isn’t less human. It’s more complex and interwoven, and it demands a new human capacity to make sense of it all.

What excites me most about this moment isn’t what machines can now do. It’s what we humans, augmented and accelerated, might now be able to do. It’s how we’ll face fewer barriers, gain more access to knowledge, and create tools that unleash what used to be locked behind money, gatekeepers, or expertise.

That isn’t without tradeoffs, and we must vigilantly observe how this technology is changing our capacity to process and reason, reshaping roles, rewriting workflows, and redefining what society values in human contribution.

Leading Through Transition

But moving forward isn’t just about adopting new technology. It’s equally about creating meaningful space for individuals to navigate their grief, reflect on their experiences, and find their personal place in the future we’re shaping.

Right now, effective leadership means providing room not only for progress but also for processing, not just focusing on the future we’re eager to build, but respecting and acknowledging the lives and practices we’re moving beyond.

Rather than simply asking them to adapt or catch up, let’s actively invite them to help lead us forward.