Last month, we convened local authority officials from across the region to share their early work on AI in public services.
The 21 projects presented at our poster session aren’t theoretical innovation exercises. They’re responses to real pressures: residents interacting with fragmented services, vulnerable people at risk of falling through gaps in provision, and infrastructure under increasing strain. That strain extends to the digital systems that underpin public services, and to the further challenge of getting those systems into the condition required to leverage AI in support of those services.
At the same time, we’ve been running public dialogues with Cambridgeshire residents about AI in local government. Heard side by side, the two conversations show strong alignment on what matters, alongside open questions about how to innovate responsibly.
Taking a problem-led approach
Improving how residents experience services. Multiple councils are tackling the frustration that residents must navigate separate websites, phone numbers, and portals for different services. Want to update your address? You’ll need to tell the council separately for council tax, the electoral roll, waste collection, and housing benefit. The goal here is not digitisation alone; it is to move from “tool-centric” design to “task-centric” design. Closely related is making information accessible. This means addressing concerns about the reading level of documents, digital literacy, and accessibility for people with disabilities or cognitive differences. Several projects are working on AI that could translate, simplify, or adapt communication; the sketch below illustrates one possibility.
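As an illustration only (not one of the showcased projects), here is a minimal sketch of adaptive communication using a general-purpose LLM API. It assumes the official OpenAI Python client; the model name and prompt are placeholder choices, and any output would still need human review before reaching a resident.

```python
from openai import OpenAI  # assumes the official OpenAI Python client

client = OpenAI()  # reads OPENAI_API_KEY from the environment

SIMPLIFY_PROMPT = (
    "Rewrite the following council letter in plain English at roughly a "
    "reading age of nine. Keep every date, amount, and deadline exact. "
    "Do not add or remove any obligations."
)

def simplify_letter(text: str, model: str = "gpt-4o-mini") -> str:
    """Return a plain-English draft of a council letter (illustrative sketch)."""
    response = client.chat.completions.create(
        model=model,
        messages=[
            {"role": "system", "content": SIMPLIFY_PROMPT},
            {"role": "user", "content": text},
        ],
    )
    return response.choices[0].message.content
```

The same pattern could target translation or alternative formats by swapping the system prompt; the human-review step is the constant.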
Breaking down silos and enabling integration. “Enterprise AI” needs to deal with disconnected systems and departments that can’t share information, even when they operationally need to. Data exists in one part of the organisation but can’t be accessed by another. Some councils are exploring AI to enable cross-service data sharing and joined-up journeys. Others are working on the underlying infrastructure – standards, integration layers, governance frameworks – that would make this possible.
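To make “integration layer” concrete, here is a hypothetical sketch of the pattern several councils described: one shared schema, plus an adapter per legacy system. Every system and field name below is invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class ResidentRecord:
    """A hypothetical shared schema that each service maps its records into."""
    resident_id: str
    name: str
    address: str
    source_system: str

def from_council_tax(row: dict) -> ResidentRecord:
    # Adapter for an imagined council-tax export; field names are made up.
    return ResidentRecord(
        resident_id=row["account_ref"],
        name=row["liable_party"],
        address=row["property_address"],
        source_system="council_tax",
    )

def from_housing(row: dict) -> ResidentRecord:
    # Adapter for an imagined housing system with different field names.
    return ResidentRecord(
        resident_id=row["tenant_id"],
        name=row["tenant_name"],
        address=row["tenancy_address"],
        source_system="housing",
    )
```

Once records share a schema, a joined-up journey such as “update your address once” becomes a loop over adapters inside the council, rather than a task the resident repeats for every service.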
Reducing the burden of manual, repetitive work. Social workers spending more time on documentation than with families. Planning officers manually validating forms and cross-referencing policies. Procurement teams reconciling supplier data across inconsistent systems. Policy teams monitoring hundreds of publications for relevant changes. The projects here aren’t about replacing people; they’re about freeing staff from work that machines can do so humans can focus on work that requires judgment, empathy, and expertise.
Enabling earlier intervention and prevention. Several councils are exploring whether AI could help identify needs before they become crises. This might be roads that need maintenance before they fail, families who might benefit from early support, or people at risk of homelessness while there’s still time to help. The appeal is clear: early intervention is both more humane and more cost-effective than crisis response. But this also raises difficult ethical questions.
Types of challenge
These are not all the same type of challenge, and recognising the difference matters for how we support them.
First, there are solutions that already exist and need to be shared. Document summarisation tools, basic accessibility checkers, AI-assisted translation: these are areas with established tools or emerging practices. The challenge isn’t invention; it’s implementation. Councils need guidance on which tools work, how to procure them, what governance frameworks to use, and how to integrate them with existing systems. These projects benefit from communities of practice that share insights into ‘what works’ in deployment.
Second, there are domain-specific challenges where AI might help. How do you analyse transport data to improve bus reliability? How do you help residents navigate complex eligibility rules for social care? How do you predict which sections of the highway network need maintenance? Here, success requires both AI capability and deep domain expertise. It’s not enough to be good at machine learning; you need to understand transport planning, social care pathways, or highway engineering. AI is a tool for solving domain problems, not an end in itself.
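As a toy illustration of that last question (and only a toy: real deterioration modelling is a highway-engineering problem first and a machine-learning problem second), a risk-ranking sketch might look like the following. Every feature and number here is made up.

```python
from sklearn.ensemble import RandomForestClassifier

# Hypothetical features per road section: [age in years, daily traffic in
# thousands, defect count at last survey]. Label: 1 = needed works within a year.
X = [[12, 8.5, 3], [3, 1.2, 0], [20, 15.0, 7], [7, 4.0, 1]]
y = [1, 0, 1, 0]

model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

# Rank unseen sections by predicted risk so engineers can prioritise inspections;
# the model informs the works programme, it does not set it.
sections = {"Road section 1": [15, 9.0, 4], "Road section 2": [2, 0.8, 0]}
risks = model.predict_proba(list(sections.values()))[:, 1]
for name, risk in zip(sections, risks):
    print(f"{name}: predicted maintenance risk {risk:.2f}")
```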
Third, there are systems and infrastructure challenges. Fragmented IT systems. Data silos. The lack of standards that would allow different councils’ systems to interoperate. The absence of integration layers that could connect new tools to legacy databases. These aren’t solved by one clever application. They require coordination across organisations, development of shared standards, and significant technical architecture work. Solving them could enable other improvements.
Understanding which type of challenge you’re facing shapes what support you need and what success looks like.
Meeting resident expectations
While councils were sharing their work, 95 Cambridgeshire residents were telling us what they expect from AI in local government. Their message: cautious optimism. They can see the potential for AI, but their support comes with conditions.
Focus on real-world needs. Residents understand councils face budget pressures; they are also watching whether AI is used to improve services or simply cut them. They want to see social workers freed from paperwork to spend more time with families. They want potholes fixed before they cause accidents. They want planning applications processed more efficiently. What they don’t want is AI deployed in ways that leave service quality worse.
Keep humans in meaningful control. The clearest red line we heard was that AI should not make final decisions that directly affect people’s lives, particularly concerning vulnerable populations, financial matters, or access to services. AI can analyse data, flag issues, handle administrative tasks, but consequential decisions must be made by humans with relevant expertise and authority, and that human oversight must be meaningful. Staff need time, training, and permission to challenge AI recommendations.
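In system terms, that requirement maps onto a simple pattern: the AI component may only produce a recommendation with its evidence, and a named officer makes and records the decision. A minimal sketch, with all names hypothetical:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class Recommendation:
    """What the AI component is allowed to produce: a suggestion, not a decision."""
    case_id: str
    suggestion: str
    evidence: list[str]  # the data points behind the suggestion
    model_version: str

@dataclass
class Decision:
    """The consequential decision, always attributed to a named person."""
    case_id: str
    outcome: str
    decided_by: str  # a person, never "the algorithm"
    followed_recommendation: bool
    rationale: str  # required, so agreements and overrides are both explained
    decided_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

def record_decision(rec: Recommendation, outcome: str,
                    officer: str, rationale: str) -> Decision:
    # The officer can accept or reject the suggestion; both paths are logged.
    return Decision(
        case_id=rec.case_id,
        outcome=outcome,
        decided_by=officer,
        followed_recommendation=(outcome == rec.suggestion),
        rationale=rationale,
    )
```

The rationale field is what makes oversight meaningful rather than nominal: a record of when and why staff disagreed with the system is also what makes the override rate auditable.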
Design for accessibility from the start. What about residents who are digitally excluded? Who struggle with forms? Whose first language isn’t English? Who have disabilities? Residents expect AI to make services more accessible, not less. That means maintaining alternative access routes (like speaking to a person), designing for diverse needs, and testing with people who face barriers.
Be transparent. Residents want to know when AI is being used, how it works, what data it uses, and who is responsible when things go wrong. They expect named accountability: a person responsible for each system, not just “the algorithm decided”. They also want to know what’s happening with savings. If AI makes a service more efficient, is that reinvested in improving provision?
Prove it works before scaling it. Residents want to see AI tested, proven to work reliably, and implemented incrementally in a test-and-learn approach. They want councils to learn from what works elsewhere rather than reinventing wheels. And they want contingency plans: what happens if systems fail?
Bringing these insights together with real-world examples of potential AI projects surfaces a set of tensions that public sector teams must navigate when using AI. For example:
Predictive systems for vulnerable people. Using AI to identify families who might benefit from early support, or individuals at risk of homelessness, has obvious appeal. Early intervention can prevent crises and is often more effective than late response, but residents asked about the impact of this on people’s lives: Could predictions become self-fulfilling prophecies? Can AI really capture the complexity of people’s lives, or will important context get lost? Have the people being “predicted about” consented to this? Will it make vulnerable people feel surveilled rather than supported? This means AI applications in such areas require careful attention to consent, transparency, human oversight, and involvement of affected communities in design.
Customer contact and chatbots. Several councils are working on AI-assisted contact centres, seeing potential to answer routine queries and free staff for complex cases. The challenge is that what councils consider “routine” might not feel that way to residents. Participants in our dialogues observed that if someone is calling the council, they’ve usually already tried online resources and failed. They need a human who can understand their situation.
Integration versus privacy. Residents want joined-up services, but they also worry about data being shared inappropriately, or information being used for purposes they didn’t consent to. Solving this requires both technical solutions (such as data governance, access controls, audit trails) and clear policy frameworks about what data sharing is legitimate. It’s not an AI problem specifically, but AI systems that operate across service boundaries make it more urgent.
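As a sketch of what those technical solutions might mean in practice (purpose-limited access plus an audit trail), here is a minimal example assuming a simple policy table; every service, dataset, and purpose named below is invented.

```python
from datetime import datetime, timezone

# Hypothetical policy: which service may read which dataset, and for what purpose.
SHARING_POLICY = {
    ("housing", "council_tax_arrears"): {"homelessness_prevention"},
    ("social_care", "school_attendance"): {"early_help_assessment"},
}

AUDIT_LOG: list[dict] = []

def request_data(service: str, dataset: str, purpose: str) -> bool:
    """Grant access only for an approved service/dataset/purpose triple, and log it."""
    allowed = purpose in SHARING_POLICY.get((service, dataset), set())
    AUDIT_LOG.append({
        "when": datetime.now(timezone.utc).isoformat(),
        "service": service,
        "dataset": dataset,
        "purpose": purpose,
        "granted": allowed,
    })
    return allowed
```

Every request is logged whether or not it is granted, which is what lets a council answer a resident’s “who accessed my data, and why?” directly.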
ai@cam’s Local Government AI Accelerator
We’re bringing these insights from councils and public dialogues into the design of our new Local Government AI Accelerator, which we’re launching today.
Projects supported by the Accelerator will receive up to £25,000 for 6-12 months of proof-of-concept work. We’re supporting multiple types of work: AI applications, but also infrastructure development and enabling processes. We’re looking to support partnerships between University of Cambridge researchers and councils, creating active collaborations that tackle challenges of both operational and academic interest. We’re embedding public dialogue from the start, with continued sessions to test ideas as they develop. And we’re supporting a community of practice in which councils and researchers learn from each other.
If you’re a University of Cambridge researcher with ideas about addressing challenges in local government – and you have or want council partners for collaboration – we want to hear from you. If you’re a local authority with problems you think might benefit from university partnership, please get in touch.
The ai@cam Local Government AI Accelerator is now open for applications. For those interested, please apply here.
*About this work: This post draws on the AI in Local Government Innovation Showcase (November 2025) and ongoing public dialogues conducted by ai@cam with Hopkins Van Mil.*