Between Algorithm and Presence: The Future of Palliative Care

A Case That Stayed With Me

During a recent clinic, I was asked to urgently see a 65-year-old woman with metastatic lung cancer after multiple lines of therapy. For weeks, she had been declining: confused, exhausted, not eating, losing weight, and no longer sleeping. Her oncologist had decided that further cancer-directed therapy wasn’t feasible. When I walked into the room, she was in her home clothes, pacing, sitting down, standing up again, repeating, “I want to go home.” Her husband was overwhelmed. He had been told more treatments existed if she got stronger, but she couldn’t even eat. Her son-in-law sat quietly in tears.

With a skilled RN and social worker, we assessed her confusion and restlessness, talked through her prognosis, navigated her husband’s hope and her son-in-law’s grief, clarified goals of care, recommended hospice, adjusted pain medications, added something for restlessness and sleep, ordered medications for dyspnea and anxiety, and coordinated same-day hospice admission. Weeks like this are not rare. We have several such visits every week: high-stakes, emotionally charged, and often chaotic. Many of these encounters could be prevented with earlier planning, but not all of them. When they happen, they are profoundly human, unfolding in the space where medical care meets what matters most to people.

A Radical Challenge to Our Assumptions

Charlotte Blease’s book, Dr Bot: Why Doctors Can Fail Us – and How AI Could Save Lives, makes a bold and radical argument: maybe the only way to humanize healthcare is to take doctors out of the center of it. She builds her case from patient advocacy. She describes, in painful detail, how medical systems fail patients, especially the most vulnerable, through discrimination, limited access, and appointments that prioritize efficiency over presence. She gives voice to people who felt abandoned by healthcare and who turned instead to technology that met them without judgment, in their own language, any time of day.

And she points to evidence. In text-based interactions, AI chatbots are often rated as more empathetic than clinicians. When people don’t know they’re talking to AI, they report feeling more heard.

But Blease goes further. She argues that, by virtue of being human, clinicians cannot consistently offer what patients need. We carry biases. We get tired. Our emotional bandwidth is limited. We’re “whirling dervishes,” trying to diagnose, document, and connect at the same time. It’s no surprise that both our thinking and our compassion sometimes falter. Blease writes: “…the human mind is mismatched with modern clinic environments.” Her conclusion: we are asking human beings to do something we aren’t structurally equipped to do at scale.

A Pivotal Moment in Cancer Care

Her critique lands at a crucial moment in cancer care.

The human crisis. The Lancet Oncology Commission recently described modern cancer care as lopsided, built around technical excellence rather than human presence. The result is what they call an “erosion of meaning, connection, and compassion.” Patients often feel unheard, reduced to numbers, their distress overlooked. Suicide risk is on average 26% higher for people with cancer. The Commission describes a “global epidemic of unnecessary suffering,” not because treatments are lacking, but because humanity is. (HT to Prof. Dr. Jan Gärtner for pointing me to the article).

The reckoning in palliative medicine. Our own field is entering what some call “Palliative Care 3.0.” The first phase built the model. The second grew it. The third must standardize quality, improve equity, and integrate deeply into healthcare, all while technology reshapes everything around us. This comes as we face uncomfortable truths. Ira Byock recently wrote that hospice and palliative care are “faltering,” facing “existential challenges,” and that we have “not in any regard succeeded” as a public health strategy.

The numbers are hard to ignore. About half of the hospices that receive star ratings have three stars or below. Many hospitals offer “palliative care light,” with untrained staff following checklists and nurses stretched impossibly thin. When patients finally get referred, the median time from referral to death is 20–30 days, too late to prevent a crisis or preserve dignity. Most dying Americans do not receive timely, comprehensive, or guideline-concordant palliative care.

Meanwhile, patients turn to AI at 2 a.m. because their doctor isn’t available, their nurse is too busy, and their fear feels unbearable. Blease sees this as proof that her vision is already coming true: patients finding support where the system has failed them.

What the Palliative Care Model Actually Is

But here’s what Blease’s argument misses: palliative care has never been built around the doctor in the center, working alone.

When I work with patients with advanced cancer, I never do it alone. Our team includes nurses who talk with patients at home and become their first line of support. Social workers who coordinate hospice, navigate insurance, and hold families through complex dynamics. Chaplains who accompany spiritual and existential questions. Nurse practitioners who are available around the clock for crises. And hospice teams that include home health aides and bereavement counselors.

This is not a doctor with helpers. Each discipline brings something essential. Nurses know symptom trajectories better than anyone because they see them every day. Social workers understand family systems. Chaplains sit with questions I’m not trained to hold.

We have never claimed physicians are the sole source of healing. Dying is too complex for that.

We also now have decades of evidence showing that presence is not a sentimental add-on. It is a clinical intervention. The Lancet Commission defines human-centered care as recognizing people in their full social and existential context, with dignity, respect, and self-determination at the center. Compassion, timely information, and attentive listening all predict better outcomes. And empathy, importantly, depends on presence, not the length of the visit.

I think of a patient I’ll call Marie, a woman in her 30s dying of cancer. When I walked into her hospital room, she lay connected to IV lines, motionless, her face soft and childlike beneath a crown of blond curls. Her mother and a friend who interpreted for her sat nearby (English wasn’t Marie’s first language). Marie understood she was dying. Her pain was controlled. Hospice was arranged. But her mother and sister were struggling. They wanted labs, transfusions, anything that felt like action. The idea of sitting at home watching her decline felt unbearable.

As we talked, something shifted: not dramatically, but perceptibly. For an outside observer, it might have looked like an ordinary conversation. But it wasn’t. I wasn’t simply providing information (AI could do that). I wasn’t only offering empathy (others can do that well). I was integrating clinical judgment – she is dying, these interventions won’t help – with the emotional reality of a family in anticipatory grief. I was helping them move from fighting the cancer to accompanying Marie through dying.

This integration – the clinical with the relational – is precisely what the Lancet Commission says is missing from modern cancer care.

How Technology Can Strengthen What We Do

None of this means technology has no place. It does – and a big one. AI could offer 24/7 symptom guidance in a patient’s own language, help track medications, provide real-time translation, triage concerns between visits, and cut documentation so clinicians can spend more time with patients. Health coaches could reach communities we have failed to serve. Peer supporters could normalize conversations about dying and help families feel less alone.

This isn’t a replacement. It’s evolution, if we design it well. The real question isn’t “technology or doctors?” The real question is: How can we use technology to ensure more people receive timely, high-quality palliative care?

What Oncologists Need, and What AI Cannot Do

When oncologists send a patient to palliative care, they need someone who understands the medical trajectory – what dying from this cancer looks like, what treatments still matter – and who can also help families navigate the emotional and existential terrain. They need someone who can tell the truth gently, support a family through it, and coordinate a team that provides continuity in the most destabilizing moment of someone’s life.

Can that be split up – AI for information, a health coach for empathy, a peer supporter for understanding, a physician for prescriptions? Anyone who works in this space knows the answer. What these situations require is integration. Someone who can read the room, understand that the confusion signals dying rather than reversible illness, sense that the family isn’t ready for blunt language, and recognize the practical urgency: she needs to be home today, comfortable, with the right medications.

Blease calls doctors “whirling dervishes.” Sometimes patients need exactly that – someone holding multiple dimensions at once – not because fragmentation is inefficient, but because dying is that complex.

And this integrated approach is exactly why early palliative care improves outcomes while lowering costs. It works not in spite of human-centered care, but because of it.

The Conversation We Need to Have

Blease pushes us to prove our value with evidence, not sentiment. Byock reminds us that we are not reaching most people who need us. Together, they ask the question we can no longer avoid: In an age of AI, what does palliative care uniquely provide – and are we delivering it?

The answer lies in what we have always done: integrated, interdisciplinary care that treats dying not just as a medical problem but as a human experience requiring expertise and presence. But we must do it better: more equitably, more consistently, and in partnership with technologies that can extend our reach.

So I’ll end with three questions:

For patients and caregivers: What technology has actually helped you feel more supported?

For oncologists: When you refer to palliative medicine, what do you most need us to provide?

For palliative care colleagues: How is technology strengthening your work?