
Designing for Dignity

Andrew Morley, Senior Practice Development Consultant at the Social Care Institute for Excellence (SCIE), explores why co-production must shape the future of AI in social care.

As artificial intelligence becomes more visible in social care, one principle is becoming increasingly clear: AI must be designed with people, not simply for them. If AI is to enhance dignity, choice and control, it must be shaped by lived experience — from design and development through to implementation and ongoing use. Without that foundation, even well-intentioned innovation risks reinforcing the very barriers it seeks to remove.

Both the risks and the opportunities of AI are already visible, and the risks of designing AI without lived experience are not theoretical. Colleagues participating in SCIE’s AI Decision Making workshops have reported that transcription tools have, at times, inferred independence where none exists or softened language in ways that distort the person’s voice. Biases embedded in datasets can reinforce stereotypes or overlook groups who downplay their needs. These are not abstract concerns, but real consequences of systems built without the insight of those who use them. Once embedded, such distortions can influence decisions at scale.

When designed well, however, AI can strengthen independence and choice. In our work with a number of local authorities, practitioners have described how transcription tools free them to focus on people rather than paperwork. Others have noted how AI can help ensure that formal records express people’s views in their own words, improving the quality of assessments and conversations. Pattern-spotting tools can help identify early signs of carer strain or changes in wellbeing. But these benefits only materialise when AI is grounded in dignity and co-production, not efficiency alone.

Designing for dignity means that co-production must be the foundation of ethical and inclusive design. SCIE’s new report, Towards a National Care Service: raising national standards of care, shows that the things that matter to people are often not reflected in the indicators used to judge system performance. Going forward, we must ensure that measures and metrics — particularly those embedded in technology solutions — reflect what matters to people who draw on care and support, not simply what is easiest to count. What we choose to measure ultimately shapes what we value.

Participants in SCIE’s Digital Programme on behalf of DHSC in 2025/26, exploring the practice, governance and ethical implications of AI use in adult social care, repeatedly warned that tools developed without lived experience risk unintentionally amplifying existing inequalities. This is particularly the case when datasets reflect only those who contact services or when tools misinterpret cultural nuance, tone or context. Co-production helps surface these blind spots early, before they become embedded in systems and harder to challenge.

Designing for dignity also means ensuring that AI strengthens, rather than erodes, people’s rights. This begins with clarity of purpose: AI should only be introduced where it genuinely adds value, not because it is novel or commercially attractive. It requires transparency about how data is collected and used, what choices people have and how accountability is maintained. It also depends on accessibility. Some of the local authorities SCIE has worked with have described the importance of communication that reflects local reading ages, languages and digital confidence levels.

Ethical AI cannot be delivered by any single organisation. Local authorities, technology partners, national organisations, universities, providers and lived experience groups all hold pieces of the puzzle. When these groups work together, they build shared understanding and trust and ensure that tools align with statutory duties and safeguarding principles.

The sector can take practical steps now to prepare for a future shaped by AI. Establishing co-production groups, reviewing consent processes and building staff confidence in critical thinking are important foundations. Mapping where AI can genuinely reduce burden — without compromising human connection — will be essential as tools become more embedded in everyday work.

Organisations like SCIE have a vital role to play in shaping a rights-based approach. By convening cross-sector dialogue, developing practical guidance and sharing learning from pilots, we can help ensure that dignity and lived experience remain at the heart of national conversations about AI.

If we get this right, AI can help create a future where people have more control, more voice and more confidence in the systems that support them.
