Making Engagement Meaningful: Beyond Tick-Box Consultation
“What’s the point of filling in this survey? They’re going to do whatever they want anyway.”
If you are the “they” in that sentence - responsible for town plans, regeneration programmes, public investment, or statutory consultation - this sentiment will feel uncomfortably familiar.
I once had a gentleman tell me, very firmly, that there was no point talking to me at a public engagement event because “they wouldn’t listen”. After some time, I managed to convince him that I was, in fact, the “they” he was referring to. He became so overwhelmed by this revelation that he completely lost his train of thought about what he wanted for his local market and wandered off.
Fifteen minutes later, he returned, beaming.
“I know what I want,” he said proudly. “I want it to be better.”
Problem solved. With the entire future of the market now clearly defined, he strolled off smiling.
That exchange sticks with me because it neatly captures the core challenge of public engagement: people often feel unheard, cynical, or disengaged - and yet they do care. The problem isn’t apathy. It’s how engagement is designed, delivered, and used to extract meaningful conclusions that can be acted upon.
There is no single right answer
The range of options is broader than many assume. A council might retain direct control, establish an arm's-length company, transfer the asset to a community organisation, sell it outright, or create a hybrid arrangement that blends several approaches. Each model carries different implications for risk, revenue, accountability, and long-term flexibility.
The choice depends on what the building is for, what the council wants to achieve, and what capacity exists - within the authority and beyond it - to make it work.
The three big problems with public consultation
1. Getting people to engage at all
Persuading people to stop on the street, click a survey link, or do anything other than leave a (usually unpleasant) Facebook comment is remarkably difficult. As a result, engagement exercises often hear repeatedly from the same voices: the most aggrieved, the most politically motivated, or those with a very specific agenda.
Add any friction at all - extra clicks, long introductions, mandatory “I agree” buttons - and most people are gone. Attention spans are short. If engagement feels like hard work, the vast majority of potential respondents will scroll on.
2. Drawing genuinely meaningful conclusions
The man on the street wasn’t wrong to be sceptical. It is extremely easy to interpret consultation findings in a way that supports what you already believe, particularly when working through large volumes of qualitative data.
Good engagement needs to be actively designed to counter this. It should be unbiased, draw from a wide range of sources, reach a broad demographic, and be anchored in both quantitative and qualitative evidence. Without that discipline, consultation becomes little more than confirmation bias dressed up as insight.
3. Getting answers that lead to real action
Ask people what they want, and the most common response is: “better” or “more”.
I’ve seen hundreds, possibly thousands, of surveys where those words dominate the free-text responses. Without a well-structured questionnaire, respondents rush through with vague answers, especially when the survey feels endless or unclear about what’s being asked of them.
As a citizen, I’ve been guilty of this myself. (Though, because of my day job, I usually grit my teeth and persevere.)
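To make the problem concrete, here is a minimal sketch of how vague free-text answers like “better” or “more” can be flagged for follow-up rather than counted as insight. The word lists and threshold are entirely hypothetical, chosen for illustration, not a production method:

```python
# Hypothetical helper for separating vague free-text survey answers
# ("better", "more") from substantive ones, so they can be followed up
# rather than treated as evidence. Word lists here are invented examples.
VAGUE = {"better", "more", "nicer", "improved", "good"}
STOPWORDS = {"i", "it", "to", "the", "a", "want", "be"}

def is_vague(response: str, threshold: float = 0.5) -> bool:
    """True when at least `threshold` of the content words are vague filler."""
    words = [w.strip(".,!?").lower() for w in response.split()]
    content = [w for w in words if w not in STOPWORDS]
    if not content:
        return True
    vague_count = sum(1 for w in content if w in VAGUE)
    return vague_count / len(content) >= threshold

print(is_vague("I want it to be better"))  # → True: the man at the market
print(is_vague("More seating near the fish stalls, and longer Saturday hours"))  # → False
```

Even a crude filter like this makes the scale of the “better/more” problem visible before analysis begins, which is often enough to justify redesigning the questionnaire.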
Tick-box engagement
When engagement doesn’t address these issues, it becomes performative. A survey sits on a council website for a month. Fewer than a hundred people respond. A report is produced, heavy on demographic charts and light on conclusions, and quietly filed away.
That’s tick-box engagement.
I’ll be honest: I’ve delivered a couple of these myself earlier in my career. I’m not proud of it, but most people in this sector will recognise the scenario: it tends to happen when a direction is already fixed, or when a rigid engagement methodology leaves no room for nuance or creativity.
The consequences of this kind of engagement are rarely immediate, but they are significant.
Poor engagement leads to weak evidence bases, which in turn lead to fragile decisions. That can mean investment that fails to land, proposals that generate unexpected opposition, or plans that technically comply with the process but lack genuine public legitimacy.
For local authorities, the risks are both practical and reputational. Schemes are delayed or diluted. Officer time is consumed responding to objections that could have been anticipated earlier. Trust erodes, making future engagement even harder. In some cases, consultation outcomes are challenged precisely because they are seen to have been designed to validate a predetermined position.
So what does good engagement look like?
This will always be subjective, so I’ll be clear: this is what good engagement looks like to me.
First, it means hearing from a large number of people across a genuinely broad spectrum of the public. Public survey responses consistently skew towards older women - draw your own conclusions about behavioural patterns - so good engagement actively works to broaden participation across age groups and communities.
For some purposes, such as establishing baseline behaviours or undertaking early fact-finding, a survey alone can be sufficient. It provides a quick temperature check and helps identify broad positives and negatives.
But for engagement intended to inform local priorities, town plans, or major regeneration proposals, surveys alone are not enough.
In-person conversations matter. Standing on the street talking to people gives you insight that no dashboard ever will. But timing matters too. Saturday afternoon footfall looks very different to Tuesday morning. Good engagement captures both, and good engagement consultants are willing to work weekends to achieve it.
Surveys and public drop-ins together form the core evidence base. But where decisions really matter, that still isn’t sufficient.
Reaching the people no one asks
A robust engagement process must also reach those often labelled “hard to reach” - usually meaning people whom organisers haven’t quite worked out how to involve, or simply haven’t prioritised.
Third-sector organisations, local business owners, school and college students, representative and advocacy groups - the relevant stakeholders will vary by place and project. Engagement here doesn’t have to be complex: informal conversations, focus groups, round-table discussions. But the insights gained are often disproportionately valuable.
Turning engagement into something useful
Good engagement doesn’t end with data collection. It ends with clarity.
The outputs should present clear, actionable conclusions, supported by evidence and framed within a coherent narrative. Where findings are conflicting or ambiguous, that should be acknowledged and explored, not smoothed over.
Engagement documents do not need to be dry, impenetrable reports full of charts and statistics (although they must be grounded in them). They need to be readable, credible, and genuinely useful for shaping next steps.
How we deliver this in practice
The most effective way I’ve found to generate high survey response rates is simple: put real budget behind targeted, sponsored social media posts, segmented by postcode, with a single click-through to a straightforward survey.
If respondents have to read long introductions, accept terms, or navigate question-by-question pages, they’re gone. Transparency matters. I’m a strong advocate for putting all questions on a single page, clearly showing people what they’re being asked to do. Response rates are consistently higher.
For a mid- to large-sized town, 1,000 responses is my benchmark. Some projects will inevitably fall below that. Market halls, which I work with most frequently, tend to generate particularly strong engagement because people care deeply about them.
Alongside surveys, meaningful engagement requires proper groundwork: understanding the place, identifying the right stakeholders, and then doing the legwork. That usually means several days on the ground running focus groups, hosting drop-ins, visiting businesses, and speaking to people one-to-one, at times that work for them.
The analysis is often the hardest part. We all have biases. That’s why a clearly structured, transparent analytical approach is essential.
Used properly, AI tools are increasingly valuable here, particularly for sentiment analysis across large volumes of qualitative responses. When carefully prompted, manually checked, and audited by humans, they allow far richer analysis than cherry-picking a handful of quotes that conveniently support a predetermined view.
AI makes people nervous, and rightly so when it’s used badly. Poor prompt design leads to odd conclusions and hallucinations. The solution isn’t avoidance, it’s competence. Learn how to use it properly, or beware.
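As an illustration of what a transparent, auditable first pass might look like, here is a deliberately simple keyword sketch - not the AI tooling described above, and with invented word lists. The design point it shows is the one that matters: every classification can be traced to explicit matches, and ambiguous responses are routed to a human reviewer rather than guessed:

```python
# Minimal, transparent sentiment pass over free-text survey responses.
# Real projects would use richer models; the point here is the audit trail:
# each label is traceable to explicit keyword matches, and uncertain cases
# go to a human rather than being forced into a category.
POSITIVE = {"love", "great", "friendly", "welcoming", "improved"}
NEGATIVE = {"dirty", "unsafe", "empty", "expensive", "declining"}

def classify(response: str) -> str:
    words = set(response.lower().split())
    pos = len(words & POSITIVE)
    neg = len(words & NEGATIVE)
    if pos > neg:
        return "positive"
    if neg > pos:
        return "negative"
    return "review"  # ambiguous: send to a human, don't guess

print(classify("I love the market, the traders are so friendly"))  # → positive
print(classify("It feels empty and a bit unsafe after dark"))      # → negative
print(classify("I want it to be better"))                          # → review
```

The “review” bucket is the safeguard: however the scoring is done, the responses a model cannot confidently place are exactly the ones a human should read.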
Leases and operator agreements
Between full ownership and outright sale sits a range of arrangements where the council retains the asset but hands operational responsibility to a third party.
A commercial lease to a private operator can generate income while transferring day-to-day management risk. A management agreement can bring in specialist expertise - a heritage trust, a hospitality operator, a workspace provider - while keeping the council in the driving seat on strategic decisions.
These arrangements work well where the council wants to retain long-term control but lacks the capacity or appetite to run the building itself. The key is getting the terms right: rent levels, repair obligations, break clauses, performance requirements. A poorly structured lease can tie a council into arrangements that become unworkable as circumstances change.
And finally…
At Next Phase we design and deliver engagement that is proportionate, inclusive, and genuinely useful - not as a procedural requirement, but as a critical input into better decisions. That means combining strong survey design with on-the-ground conversations, thoughtful analysis, and clear, evidence-based recommendations that clients can actually act on.
Whether we are supporting town centre strategies, market investment, or wider regeneration programmes, our focus is always the same: engagement that builds trust, reduces risk, and leads to outcomes that stand up to scrutiny.
Yes, engagement reports will always include pages of detailed response breakdowns. They need to; that’s what makes them robust and defensible. But every section, and especially the executive summary, should translate evidence into clear, realistic recommendations that decision-makers can actually act on.
Otherwise, the man on the street was right.
What was the point?