Pauline Vuyelwa Muswere-Enagbonma, Chief Executive at domiciliary care agency Jessamy Platinum Care Group, argues that lived experience must shape the ethical design and use of AI in care.
Social care has never been short of innovation. What it has been short of is dignity — delivered consistently, at scale, in the messy reality of people’s lives. That is why I approach artificial intelligence (AI), data and design with conviction — and caution. As a disabled leader and mental health advocate, I have lived what happens when systems are built around assumptions rather than people.
AI in care is neither miracle nor menace. It is a multiplier. It amplifies what we already value, what we measure, and what we are willing to ignore. That makes the real question not what AI can do, but what we choose to prioritise.
“Technology will not humanise care unless we first humanise the way we build technology.”
The promise is tangible: better continuity, fewer missed risks, more time for relationships, and clearer oversight where services are stretched. The peril is quieter: believing a model’s “insight” over a person’s truth; mistaking efficiency for effectiveness; confusing data completeness with understanding.
My lived experience has trained me to notice what never appears on dashboards: the harm of being misunderstood, the fatigue of repeating yourself, the damage of being treated as a case rather than a human being. That is where ethical AI must begin.
“Ethical by design” often means policy statements, fairness checklists, or governance committees. These matter. But without lived experience at the table, something critical is missing: accountability to real-world consequences.
Lived experience is not consultation. It is expertise. It shows us what fractures dignity in practice. It challenges habits such as:
- Building for the “average” user when care is rarely average
- Labelling "non-compliance" without examining context
- Treating consent as a one-off transaction rather than an ongoing relationship
Across our ecosystem work — homecare, supported living, workforce development, and the JessamyCareOne platform — we treat co-production as a design discipline, not branding. We ask a simple question: who is most likely to be harmed if this is wrong?
“If you are not designing with the people most affected, you are designing against them.”
Predictive tools can support earlier intervention in safeguarding, falls prevention, or relapse patterns — but only if they prompt curiosity, not certainty. The risk is not prediction itself. The risk is what happens when people are reduced to risk scores and staff to compliance operators.
That is why guardrails matter: transparency, plain-language explanation, the right to question, and professional judgement that is central — not symbolic.
Dignity is control, voice, and being believed. Digitally, that means:
- Consent that is informed and revisitable
- Privacy that works in practice
- Accessibility by default
- Data collected only for clear care benefit
- A documented human override
If a system cannot be explained simply, used safely under stress, or adapted to individual need, it is not fit for care, however impressive the analytics.
Innovation can strengthen the workforce if it restores pride and reduces grind. Young professionals stay where they can practise skilled, relational work without drowning in duplication. But if digital tools intensify surveillance or sideline judgement, they will deepen burnout.
The right digital future supports excellent care and makes it easier to evidence — without turning compassion into a metric.
Technology can assist memory, connection, and continuity after bereavement. But grief and healing remain relational acts. We heal because we are witnessed, not because a system has processed our loss.
“Beyond Digital” is a leadership stance: technology as infrastructure for dignity, not decoration, not distraction, and not a substitute for care. It is about designing for the hardest days and the quietest voices. When lived experience leads the ethical use of AI, data, and design, we do not simply modernise care — we make it more just, more inclusive, and more truthful.