
Fluent in AI? Good. Curious? Better.

Written by Huibert Evekink | Oct 30, 2025 8:30:02 AM

 

Every week, a breakthrough. Tools that write, design, forecast, diagnose, and automate tedious tasks. We invest, we upskill, and we roll out AI initiatives. Yet so far the returns are underwhelming: many teams report stalled pilots, marginal productivity gains, or little meaningful ROI.

This isn’t a new story. Organizations have long struggled to find immediate practical applications for emerging technologies like computers and the internet. It took time, not just to understand the tools, but to reimagine how they could fit into workflows, culture, and cash flow. We’re in a similar phase now with AI: the tools are advancing, but the imagination and curiosity to use them meaningfully still lag.

 
AI fluency is not a sustainable competitive advantage
Many organizations have mistaken access to AI for an advantage. Tools are getting faster, cheaper, and easier. But when everyone has access to the same tools, the explorer mindset becomes the only differentiator. Knowing how to ask AI for information or speed up a simple task is no longer rare. Investing the time and effort to explore AI’s possibilities in depth, however, is still left to the early adopters.

Curiosity is the psychological energy that drives exploration, testing, and learning, whether it’s a product manager iterating on a new user flow, a teacher personalizing a learning program, or an analyst probing unexpected data trends.

Many of the opinion makers and AI early adopters dominating the conversation tap into a natural curiosity. They often have technical backgrounds and are commercially invested in AI’s success. But that creates a blind spot: it’s hard for the curious to imagine what it’s like not to be. As a result, they often overestimate how easily others will engage and underestimate the support most people need to get started.

This is a problem because if we want widespread adoption, we can’t design for the few; we need to support the many. Most people don’t resist new technology because they’re lazy or uninterested. They resist because they lack the right psychological conditions. For them, curiosity is not innate and, therefore, must be nudged and nurtured to be activated.

Curiosity is not a given

We regularly hear senior leaders in our workshops—often attended by cross-functional managers or department heads—open with lines like, “Be more curious.” Then the room falls silent, and people stare blankly at the floor. Because for most of us, curiosity can’t be commanded; it must be enabled.

While curiosity is a basic human drive present in varying degrees in everyone, only a distinct minority — 15% of people — are considered curious by nature, according to leading estimates from organizational and psychological research.* This group is characterized by a strong urge to explore, discover, and grow, exhibiting behaviors such as openness, adaptability, playfulness, and a proactive approach to new experiences.

Looking at new technology adoption, about 2.5% of the general population are considered “Innovators,” while another 13.5% fall into the “Early Adopters” category**. This means that roughly 16% of people are very curious and likely to adopt new technology early. Even if we stretch the group to include the early majority (roughly another third of the population), that still leaves about half of us who wait and see, needing more time, evidence, or support before engaging with new tools or ideas.

That’s not a flaw. It’s part of our evolutionary design. We’re wired for balance: we need explorers, builders, and stabilizers.

However, in a world shaped by rapid, continuous waves of technology, higher baseline levels of curiosity have become a vital survival skill. Those who can’t explore, adapt, or connect meaning won’t fall behind because they lack intelligence; they’ll fall behind because they lack curiosity and stamina.

Two kinds of curiosity. Both are essential

Not all curiosity looks the same. In fact, the path to meaningful AI adoption beyond “Google search on steroids” depends on understanding two distinct types:

Aesthetic curiosity — playful, sensory, open-ended. The drive to see what happens. It sparks with a surprising prompt, summary, idea, a weird demo, or a question no one has asked yet.

Epistemic curiosity — cognitive, focused, structured. The prolonged drive to understand. It digs deeper, tests assumptions, and seeks to understand how things work and connect.

An AI prompt can spark both:

  • A marketer asks an AI chatbot to come up with some ideas for a tagline. That’s aesthetic curiosity.

  • A marketer tests the same prompt across three AI tools and notices big differences. Curious, they tweak the prompt and compare outputs. Then they study what makes messages persuasive—digging into research on framing. Combining what they learn with what the AI shows them, they craft a tagline neither could create alone. That’s epistemic curiosity: learning to develop with AI, not just react to it.

One lights the spark. The other builds the fire.

For a technology as abstract and probabilistic as AI, epistemic curiosity is the engine of co-intelligence. While openness to novelty can initiate the journey, it’s the intellectual drive to make sense of complexity that transforms initial play into a long-term human competitive advantage.

For teams and organizations, this means striking a balance between space for playful exploration and structured inquiry that leads to durable learning, insights, and innovation.

You can’t command curiosity. But you can design for it

Even the most naturally curious person can lose the explorer mindset when placed in an environment that punishes questions, overloads attention, or fails to provide feedback. Curiosity needs to be supported, renewed, and protected over time, especially for less explorer-minded individuals.

Curiosity goes flat when:

  • We’re judged or punished for asking questions or coming up with ideas.

  • We’re overloaded and starved for time and space.

  • We receive no feedback or outcomes from our ideas and experiments.

  • We scatter our attention across too many directions at once. The real risk is losing the explorer mindset, not because curiosity is absent, but because it’s impatient, unfocused, and unaligned with purpose.

To build cultures of sustained curiosity, we need to remove these friction points. That requires more than space—it requires training, leadership, and systems.

So how do we design for sustainable curiosity?

  • Psychological safety: Make it okay not to know. Normalize asking “dumb” questions. Celebrate learning moments over performance outcomes.
  • Meaningful relevance: Tie (AI) exploration to real-world work. People get curious when it helps them solve a challenge they care about.
  • Learning how to learn: Epistemic curiosity isn’t just about asking deeper questions—it’s about knowing how to gain the knowledge needed to answer them.
  • Small wins, surprising sparks: Use unexpected demos, analogies, or results to open attention loops. Curiosity begins with surprise.
  • Clear next steps: After the spark, guide people from “what is this?” to “how does it work?” to “how can I use it?”
  • Progressive challenge: Keep people in the region of proximal learning—not overwhelmed, not bored. Curiosity peaks on the edge of capability.
  • Reflection and reinforcement: Help people track their own insight development. Make it visible when curiosity leads to impact.
  • Make it social: Curiosity scales faster through peer learning. Build shared rituals of discovery, challenge, and storytelling.
  • Norms and habits: Turn curiosity into “how we work here.” Reward thoughtful experimentation. Codify reflection. Make curiosity part of team identity.
  • Leaders go first: Curiosity won’t scale unless leaders model the environment they want to create. Even if they don’t feel innately curious themselves, their willingness to be vulnerable—to ask questions, admit uncertainty, and prioritize learning—sets the tone.

Final Truth: Curiosity is back to being a survival skill

In stable environments, curiosity is optional. In fast-changing ones, it’s non-negotiable. It’s not about turning everyone into inventors. It’s about building curiosity resilience: the ability to stay engaged, open, and adaptable when things get technical, messy, or slow to reward.

In the era of intelligent machines, the value isn’t in having the best tools because everyone will have them. It’s not even just knowing how to use them. It’s in imagining what you can meaningfully improve with them. Those ideas and questions come from curiosity.

Fear can’t be the fuel. Many organizations try to drive adoption and innovation through urgency, anxiety, or even a subtle threat: keep up or be replaced. But fear drains creative energy. It narrows attention. It shuts down exploration. You may achieve short-term compliance, but you won’t foster long-term innovation.

Sustainable creativity with AI depends on fluency to get you started, but it is human curiosity that will keep you engaged.

Thanks for reading futurebraining!