
ORIENTATION: Why This Book Matters
There is a question that most organisations are not yet asking seriously enough. Not “how do we implement AI?” but “if AI handles more of the analytical work, what exactly are we asking leaders to do?” More Human is Rasmus Hougaard and Jacqueline Carter’s answer to that question, and it is both clarifying and uncomfortable.
Hougaard and Carter are not AI sceptics. They are not warning leaders away from technology. Their argument runs the other way: as artificial intelligence accelerates in capability, the case for distinctly human leadership becomes stronger, not weaker. But most organisations are not investing in human leadership in proportion to the technological acceleration they are pursuing. That gap is the central concern of this book.
The authors draw on decades of research through Potential Project, one of the world’s largest leadership development organisations. Their lens is not theoretical. It is built from work inside organisations navigating precisely the transitions they describe. These are companies where AI adoption is moving faster than leadership culture, and where the consequences are beginning to show as eroded trust, diminished presence, and the quiet withdrawal of human judgment from decisions that still require it.
More Human asks leaders to consider a specific reorientation: not whether to adopt AI, but how to remain genuinely human in the act of leading while doing so. That is a harder question than most leadership books attempt, and Hougaard and Carter take it seriously.
DISTILL: Core Ideas
The book’s core argument can be stated plainly. As artificial intelligence performs more of what leaders have traditionally done, including analysis, pattern recognition, and information processing, the remaining leadership work becomes more human in character, not less. Awareness, wisdom, and compassion are not soft virtues to cultivate once the real leadership work is done. In an AI-saturated environment, they are the real leadership work.

This is not a comforting message for leaders whose value has historically been tied to what they know or how quickly they can decide. It asks leaders to build an entirely different kind of authority. One grounded in the quality of their presence and the depth of their judgment, rather than the speed of their output.
What makes More Human particularly relevant for the Trust and Psychological Safety theme is its treatment of human connection as a leadership output, not a leadership byproduct. Trust is not what remains when technology does the work. It is what leaders must actively build, and it requires the very capacities that acceleration tends to erode.
DEEP DIVE
The book’s central framework rests on three interlocking capacities: awareness, wisdom, and compassion. These are not presented as character traits that leaders either have or do not have. They are presented as disciplines. Capacities that can be cultivated deliberately and that require deliberate protection from the forces that degrade them.
Awareness, as Hougaard and Carter define it, is not self-awareness in the narrow psychological sense. It is the capacity to observe clearly. To see how technology shapes decisions, how organisational dynamics shift under algorithmic pressure, and how one’s own cognitive and emotional state affects the people being led. A leader operating at high velocity and under significant informational load is not, by default, an aware leader. Awareness requires a kind of stillness that acceleration actively undermines.
Wisdom is the capacity to interpret complexity with maturity. It involves holding competing considerations, resisting the pressure of quick resolution, and making decisions whose consequences are understood beyond the immediate term. The authors are specific about what wisdom is not: it is not expertise, and it is not intelligence. An expert can be right and still unwise. Wisdom involves the willingness to act under genuine uncertainty without pretending that uncertainty does not exist.
Compassion, in this framework, is the most demanding capacity of the three. The authors draw a careful distinction between empathy, which is the experience of feeling what another person feels, and compassion, which they define as the capacity to act in the genuine interest of another even when it is difficult. Compassion is what prevents technological efficiency from becoming human indifference. It is also the capacity most directly threatened when leaders are under sustained pressure. Under pressure, leaders tend to optimise for results and manage for compliance. Compassion requires more than that.
Together, these three capacities create what the authors describe as a leadership mode that is not in competition with AI but in necessary relationship with it. The argument is not that technology is dangerous and humans must remain central as a counterweight. It is that the highest-order leadership functions, including building trust, sustaining meaning, and making genuinely good decisions in conditions of ambiguity, are precisely the functions that AI cannot perform and that humans must not inadvertently abandon in the rush to adopt it.
Hougaard and Carter are also clear about what makes this difficult in practice. The organisations most aggressively adopting AI are often the same organisations that reward speed, scale, and analytical output. In those environments, the space for awareness, wisdom, and compassion tends to contract precisely when it is most needed. The book does not pretend this is easy to reverse. But it makes a compelling case that leaders who do not address it will find themselves running increasingly powerful systems in increasingly hollow cultures.
DIAGNOSE
The diagnostic problem this book surfaces is one of misalignment between organisational investment and organisational risk. Most organisations accelerating into AI are investing heavily in the technological infrastructure: the systems, the data architecture, the algorithmic tools. Far fewer are investing with equivalent seriousness in the human leadership infrastructure required to guide those tools responsibly.
This misalignment produces predictable symptoms. Leaders who are increasingly reliant on algorithmic outputs begin to delegate not only analysis but judgment. The meetings get shorter. The decisions get faster. The explanations get thinner. And the people being led begin to experience something they often cannot name precisely but recognise immediately: the sense that the leader is no longer fully present, no longer genuinely engaging with them as human beings navigating difficult circumstances.
The consequence is not dramatic. It is quiet. Trust does not collapse. It recedes. Psychological safety does not disappear. It diminishes. The ideas that people bring to meetings become smaller and safer. The concerns that might have been raised are held back. The organisation continues to function. But it functions at a lower level of human engagement than it could, and the gap widens over time.
Hougaard and Carter’s diagnosis is pointed: organisations are not failing to adopt AI. They are failing to lead through it. And the difference between those two things is the difference between an organisation that uses powerful tools well and one that uses powerful tools in ways that gradually undermine the human conditions on which high performance actually depends.
DETAILS
The Awareness Capacity
The authors describe awareness as the foundational capacity because it precedes everything else. A leader who cannot observe clearly, including their own internal state, the dynamic in the room, and the ways technology is shaping the conversation, will struggle to exercise either wisdom or compassion with any reliability. Hougaard and Carter argue that developing awareness requires practice, not just intention. The specific practice they emphasise is mindfulness. Not as a wellness intervention but as a leadership discipline. The ability to be genuinely present in a conversation, rather than half-present while processing the next decision, is not a soft benefit. It is the difference between a leader who builds trust and one who steadily erodes it.
The Wisdom Capacity
Wisdom, as the authors develop it, has a specific texture. It is not the accumulated knowledge of a domain expert. It is the capacity to hold complexity without resolving it prematurely. To make decisions that account for what is uncertain, what is at stake for people, and what the longer-term consequences might be. In a world where AI can generate confident-sounding analysis at high speed, the temptation for leaders is to treat algorithmic output as a substitute for judgment. Hougaard and Carter push back on this firmly. The quality of a leader’s judgment is not in the speed of the conclusion but in the quality of the reasoning that arrives there. That reasoning must incorporate the human dimensions that no algorithm currently captures.
The Compassion Capacity
The book’s treatment of compassion is its most provocative section. The authors argue that compassion is not a personality trait but a leadership practice. It is precisely the practice most in danger of being squeezed out by the combined pressures of AI acceleration and performance culture. Their research finds that leaders consistently overestimate how compassionate they are and underestimate how much their people need them to demonstrate it. The gap is not one of character. It is one of attention. Compassion requires time to listen past the surface, time to understand what is actually being experienced, and time to respond to the human being rather than the performance issue. That time is exactly what most leaders report having least of.
The AI Leadership Paradox
The book’s deepest insight is what the authors frame as the AI leadership paradox: the organisations moving fastest into AI adoption are the ones that most urgently need strong human leadership, yet they are the ones least likely to be investing in it. Technological capability scales rapidly. Human capacity develops slowly. The organisations that recognise this asymmetry and invest accordingly are, the authors suggest, the ones that will not only perform better but sustain that performance. They will have preserved the human conditions on which genuine collaboration, creativity, and trust depend.
Presence as a Leadership Variable
One concept the authors return to repeatedly is presence. Not as a charisma question but as a trust question. Their research suggests that what people most need from leaders in high-pressure, high-uncertainty environments is not more information and not more direction. It is the experience of being genuinely seen by someone with authority. Presence is the quality of attention a leader brings to an interaction. It is what communicates that the person being led matters beyond their output. In an environment where AI is increasingly present in every decision, human presence is not a counterbalance to technology. It is the condition that makes the use of technology feel safe rather than dehumanising.
NICHE CAPACITY LENS
Leader’s Shelf Capacity: Human-Centred Leadership in an AI Context
Within the Leader’s Shelf leadership intelligence framework, More Human maps directly onto the capacity we identify as human-centred leadership. This is the ability to lead in ways that strengthen rather than diminish the human conditions on which high performance depends. What Hougaard and Carter add to this capacity is its specific application in an AI-accelerated context. The question is no longer whether to be human-centred as a matter of values. It is whether leaders have actively cultivated the awareness, wisdom, and compassion required to remain so as the pace and complexity of their operating environment intensifies. This book makes a compelling case that those capacities are not incidental to leadership effectiveness in the AI era. They are its core.
MICRO PRACTICES
The Presence Audit
Once a week, review the previous five days and identify the conversations in which you were fully present versus those in which you were physically present but mentally elsewhere. The ratio is data. If more conversations fall into the second category than the first, the question to ask is not how to manage your diary better but what is consuming your attention at the level that prevents genuine presence.
The Judgment Pause
When you receive an AI-generated analysis or recommendation, build in a deliberate pause before acting on it. Not to question the data, but to ask what the data does not capture. What is the human context here? What is at stake for the people this decision affects? What would someone with genuine wisdom see that the algorithm cannot? The pause does not need to be long. It needs to be consistent.
The Compassion Checkpoint
At the end of each week, identify one person in your team whom you have recently managed rather than genuinely engaged. Not someone you have ignored, but someone whose performance you have attended to without attending to the person. Schedule a conversation with no performance agenda. Ask how they are experiencing their work right now. Listen past the first answer.
The Awareness Reset
Before your first significant meeting or decision each day, take two minutes. Not to plan, not to prepare, but simply to notice your own internal state. Are you calm or reactive? Curious or defensive? Focused or scattered? Awareness of your own state before you enter a room is not a mindfulness exercise for its own sake. It is what allows you to choose how you show up, rather than simply reacting from wherever you happen to be.
The Humanity Review
Periodically review the decisions your team has made in the past month that were substantially informed by algorithmic recommendation. For each one, ask where human judgment was exercised and where it was absent. Where might the absence of human judgment have produced a technically correct but humanly insufficient outcome? This review is not about finding failures. It is about building institutional awareness of where the human leadership layer is operating and where it may have quietly withdrawn.
REFLECTION QUESTIONS
Where in your organisation is AI adoption moving faster than leadership culture? What is the evidence of that gap, and who is responsible for closing it?
If the people on your team were asked whether you lead with awareness, wisdom, and compassion, what would they say? What is the gap between their answer and your own?
What does presence mean in your specific leadership context? When are you most present with the people you lead? What conditions make that presence difficult to sustain, and what would it take to protect it?
As AI takes on more of the analytical work in your organisation, what is your answer to the question: what is distinctly human about the leadership you are providing?
“The more intelligent our machines become, the more essential our humanity becomes.”
SOURCES
Hougaard, Rasmus, and Jacqueline Carter. More Human: How the Power of AI Can Transform the Way You Lead. Harvard Business Review Press.
Research and organisational data from Potential Project, including studies on leadership effectiveness, mindful leadership, and compassionate leadership across global organisations.
CLOSING SYNTHESIS
More Human does not make the case that AI is a threat to be managed. It makes the case that AI clarifies something that was always true but could previously be avoided: the most important things leaders do cannot be automated. The quality of trust between a leader and a team. The willingness of people to bring their best thinking, their honest concerns, and their genuine creativity to their work. The sense that they are being led by someone who sees them. These are not technology problems and AI will not solve them. They are human leadership problems, and they require human leaders who have actively built the capacity to address them.
What Hougaard and Carter are describing is a choice that every leader operating in an AI-enabled environment now faces, whether they name it as such or not. The choice is not between adopting technology and remaining human. It is between adopting technology without attending to what that adoption requires of leadership, and adopting it while investing in the human capacities that make powerful tools safe to use, meaningful to work alongside, and aligned with the interests of the people they affect.
The leaders who make the second choice will not simply feel better about their leadership. They will build organisations where trust holds, where safety to speak exists, and where people remain willing to bring their full capacity to their work. In a world of accelerating technological capability, that is not a soft advantage. It is the distinguishing one.
