Most organisations struggle with employee feedback follow-through. They collect insight but fail to show what changes as a result. This piece explores why “closing the loop” often breaks down and what good follow-through really looks like in practice.
Over the past few months, I have noticed a familiar pattern.
There has been a noticeable increase in the frequency of pulse checks – shorter surveys, quicker turnaround times, and a growing reliance on dashboards that promise near real-time insight. On the surface, it looks like a step forward. Listening appears more active, more embedded, more continuous than it has ever been before.
And yet, when you spend time underneath the data, the sentiment hasn’t shifted in the same way.
In one recent piece of work, an employee captured it more clearly than any dashboard could:
“We keep being asked. I’m just not sure anything changes.”
That is the gap, and it is a significant one.
An observation
We were working with a leadership team as they reviewed the results of their latest employee feedback cycle.
There was, on paper, a lot to be encouraged by. Response rates were high, suggesting people were willing to engage. The themes that emerged were consistent and easy to interpret: communication felt uneven, decision-making processes appeared slow, and there was a general lack of clarity about where to go for answers.
None of these insights were especially surprising. In fact, most of them had surfaced in previous rounds of feedback.
To their credit, the leadership team responded quickly. They shared a summary of the findings, acknowledged the concerns that had been raised, and communicated a clear intention to improve communication going forward.
From their perspective, they had done what was expected. They had listened, responded, and moved forward.
In other words, they believed they had closed the loop.
Three months later, we returned to speak to employees again.
What we found was not resistance or cynicism, but something quieter and more telling. The same themes were still present. Not because they hadn’t been discussed internally, and not because leaders had ignored them, but because whatever action had been taken had not been visible in a way that people could recognise.
One comment, in particular, stayed with us:
“I know they heard it. I just don’t know what they’ve done about it.”
The quiet failure of “closing the loop”
This is where many well-intentioned efforts begin to fall down.
The issue is rarely a lack of intent, and it is almost never about whether feedback was collected or acknowledged. The real gap sits between hearing something and being able to demonstrate, clearly and credibly, what has happened as a result.
When follow-through is not visible, people will fill in the blanks for themselves. More often than not, they land on a simple conclusion: nothing changed.
Over time, that assumption begins to shape behaviour. Feedback becomes more cautious, more filtered, or in some cases, it stops altogether. Not because people no longer care, but because they have learned what happens, or doesn't happen, when they speak up.
What good follow-through actually looks like
In practice, there are three distinct levels to responding to feedback.
Most organisations reliably reach the first. Some are able to move into the second. Very few, however, operate consistently at the third level, and that is where the real difference is made.
1. Acknowledgement
At this stage, organisations communicate that they have heard the feedback. This might take the form of a summary, a leadership message, or a simple expression of thanks. It matters, but it is also expected. On its own, it does little to build lasting trust.
2. Action
Here, organisations begin to outline what they intend to change. There is movement, and often genuine effort. However, without clarity and visibility, much of this work happens out of sight, which means its impact is easily missed.
3. Proof back
This is where things begin to shift.
At this level, organisations do not just act: they make that action visible and explicitly link it back to what was said. They close the gap between input and outcome in a way that people can see and understand.
It is the difference between saying something has been done and showing that it has.
What would have changed the situation?
What is striking about the situation we described earlier is that it did not require a different survey, more data, or a more sophisticated listening tool.
What it required was something both simpler and, in many ways, more demanding: clarity.
Clarity about what had been heard, what was going to change as a result, and how that change would show up in day-to-day experience.
Imagine, for a moment, if the follow-up had been more explicit:
- That communication would not just “improve,” but that from a specific date, all team updates would follow a consistent format that everyone could recognise
- That slow decision-making would be addressed through a defined weekly forum, with clear ownership and accountability
- That uncertainty about where to go for answers would be resolved through the introduction of a single, visible channel
- And that there would be a commitment to return within a set timeframe to share what had improved and what had not
The feedback itself would have been the same.
But the outcome would likely have been very different, because people would have been able to see the connection between what they said and what happened next.
Why this matters now
This matters particularly now, because many organisations are doubling down on the frequency of listening.
There are more pulse surveys, more check-ins, and more structured opportunities for people to share their views. The intent is positive: to stay closer to sentiment, to respond more quickly, and to create a stronger sense of voice.
However, frequency on its own does not build trust.
In fact, without visible follow-through, it can begin to erode it.
Each time someone takes the time to give feedback and does not see a clear outcome, the next response becomes more considered, more cautious, and often less useful. Over time, the data may still arrive, but it carries less of the truth that organisations are hoping to access.
The real work of listening
It is easy to think of listening as the moment a survey is sent or a question is asked.
In reality, that is only the starting point.
The real work begins afterwards, in the discipline of taking what has been heard, translating it into meaningful action, and making that action visible in a way that people can recognise and believe.
And crucially, doing that not once, but repeatedly.
Because trust is not built simply on being asked for input. It is built on the consistent experience that speaking up leads to something tangible.
If people cannot see change, they do not necessarily become more vocal in their frustration. More often, they simply adjust their expectations and their behaviour.
They stop challenging.
They stop offering the full picture.
They begin to work around the system rather than through it.
And by the time the data begins to signal that something is wrong, many of the people who first noticed it have already, quietly, moved on.
Are you closing the Listening Loop?
