Four questions to help leaders think through the implications.
During a recent workshop, a senior executive said to me, “None of us knows how people will learn in this new era.” I hear this sentiment across industries. Leaders understand that AI is transforming tasks and workflows, yet they are far less certain about how it is reshaping the processes through which people develop: how they gain expertise, build empathy, and form identity at work.
In human learning and development, the pace of technological change now exceeds our capacity to fully make sense of it. In such moments, certainty is elusive. What is possible, however, are conversations in which leadership teams ask difficult questions and listen openly to one another. I call these “sense-making conversations.” In my recent advisory and teaching work, I’ve developed four “provocations” to support these conversations, which I believe all leaders should be having. They are not answers but prompts: tools that can help leaders explore how AI is altering the conditions for human learning and growth in their organizations.

What happens when AI makes the pathways to mastery disappear?

I begin by inviting executives to describe their own pathways to mastery. In ways that are sometimes brutally honest, they talk of countless hours of practice, moments of intense insight, wisdom gained from failure, and mentors who gave them feedback. It’s all experiential learning that shaped their expertise, resilience, judgment, and identity. We then consider what disappears when AI does this early work.

A pattern I now see emerging is concern that AI “shortcuts” are disrupting these natural pathways to mastery and the desirable difficulties that produce expertise. “If my young analysts never struggle,” one senior banker said to me, “and put in those long hours that I did, will they ever learn to think?” Many worry that the very experiences that shaped their own careers are at risk of being engineered away: the practice, the frustration, the honing of the craft. If AI drafts the strategy memo, analyzes the data, or generates the first 20 ideas, what happens to the slow, demanding process of repetition and learning that once formed the pathway to mastery?

The developmental implications are beginning to emerge. AI will undoubtedly accelerate learning, but accelerated learning is not the same as development. Acceleration increases output; development transforms identity.
The two are not interchangeable. In these conversations, the goal should be to discuss how to keep human development at the center of organizational learning, rather than letting AI take over the experiences that build mastery. Will we preserve those experiences or fall prey to the tyranny of productivity?

Are we drowning out calm?

Here I ask executives to reflect on work lessons from the pandemic, when their teams adopted “pandemic tech” such as Zoom, Teams, and digital-collaboration platforms. They tell me how these technologies enabled people to work continuously under extraordinary circumstances. But they also reflect on the unintended consequences this triggered: The number of meetings increased by 50%, workloads intensified, and time for deep, focused work contracted significantly. What began as a necessary technological solution quickly expanded into overload. The ease of convening people created more activity, not necessarily better thinking, and the space for reflection and deep work shrank.

With AI we risk repeating that pattern at an even larger scale. If pandemic technologies expanded the volume of meetings, AI expands the volume of content, making possible more presentations, more reports, and more drafts, much of it produced with negligible friction. Executives already describe seeing early signs of this. AI-generated slide decks and summary documents are multiplying, often faster than teams can interpret or prioritize them. As one executive put it to me, “We’re generating more but thinking less.”

The hope was that automating knowledge work would leave space for reflective, creative work. The emerging reality is that this proliferation simply creates noise that crowds out the very conditions essential for learning: calm, reflection, and the space to think deeply. Executives increasingly describe days filled with AI-generated material but fewer moments available to interpret, integrate, or question.
In these conversations the challenge is a frictionless flow of content that overwhelms attention and obscures insight. AI makes many things easier but can also diminish learning and increase noise. The question should no longer be “Can AI do this?” but rather “Does this add value?”

Are we dulling what makes us human?

Executives frequently tell me that the capabilities they value most in their employees are also the hardest to develop: discernment, intuition, moral reasoning, and, above all, empathy. Empathy has been shown to be a capability that can be built systematically through repeated exposure to emotional nuance, conversation, and the interpersonal work of navigating tension and ambiguity.

AI is beginning to shift these developmental conditions. Leaders see that AI can already simulate aspects of cognitive empathy and approximate emotional empathy. Yet the behavioral dimension, caring enough to act, remains distinctly human. And, crucially, it is learned.

What I hear most often from executives is not that they fear AI will replace empathy. It’s that they fear it will replace the contexts in which empathy is developed. Empathy grows through practice: reading subtle cues, managing conflict, engaging in difficult conversations, supporting a colleague under pressure, showing vulnerability in teams. Those are the friction points that test and strengthen emotional capability. As one executive said to me, “If AI handles the difficult conversations, how will people learn to have them?” Another reflected, “My managers let the tool interpret tone before they try to understand it themselves.”

When AI intermediates human interaction, it reduces exposure to the experiences that cultivate empathy. The convenience the technology provides strips away the very conditions through which empathy, judgment, and relational capability are formed: exposure, friction, and interaction.
In these conversations, therefore, the challenge is to identify, protect, and actively design those conditions. The question to ask is: In adopting AI, are we unintentionally ridding ourselves of the very learning experiences that build empathy?

Are we eroding choice and identity?

As AI becomes more deeply embedded in organizational workflows, from task allocation to recommendation systems, it is reshaping not only how work is accomplished but also how people make choices and learn. Many of the new AI-enabled tools nudge behavior, propose next steps, or automate decisions. In doing so, however, they are also stripping people of the capacity to reflect, to choose, and to take ownership of their decisions. Employees are in danger of losing their agency, in other words, which is the engine of human growth and development.

Executives recognize this. In workshops, they describe development pathways that feel increasingly guided, with fewer moments that demand independent judgment. Their concern is developmental: What happens to personal choice, agency, and identity when the habit of self-authorship weakens? “If the system always knows the next step,” one executive asked me recently, “when do my people learn to choose for themselves?”

In these conversations, the challenge is to design AI-enabled systems that preserve space for human choice. This requires deliberately retaining moments for reflection, decision-making, and exploration. The question, ultimately, is this: Will we cede agency to machines, or will we design for human authorship?

. . .

AI will transform work. There is no doubt about that. But people can, and should, determine whether AI transforms learning. That’s why the provocations and sense-making conversations I’ve described in this article are so crucial. We don’t know what the future has in store for us, but it’s time to start thinking hard about what we want.
Most importantly, it’s time to ask: In an age of intelligent machines, how do we ensure that people continue to develop into their most capable selves?