One theory as to why NHS communicators do not always enjoy parity with other senior managers in trusts is that, individually and collectively, the communications profession may not be doing enough to demonstrate the strategic value and impact of what senior communicators and their teams do. The responses to this section of the survey showed significant variation in how much time, energy and focus senior communicators put into evaluation. This is often determined by how much capacity teams have, with impact assessment activities often sacrificed – or undertaken at a more basic level – when teams are short-staffed and overworked.
The responses can broadly be categorised into three levels of activity:
Routine key performance indicator collection and reporting
The majority of communicators routinely collect and report key performance indicators that measure activity – for example, digital statistics, social media mentions and media coverage. These are most often reported to their trust’s board on a monthly or quarterly basis, often with an associated dashboard. These approaches are on occasion supplemented with more advanced tools, such as annual staff surveys, communications and stakeholder audits, and formal assessment against strategic objectives – both overall and those set for specific campaigns.
One communicator said: “We provide a constantly evolving dashboard of stats, feedback, and anecdotal/visual evidence (e.g. attendance rates at events/briefings etc.). Lots of digital focus so stats and impact are getting easier to run off and view quickly.” Another commented: “We develop a quarterly communications dashboard which is reported to the executive directors and board. It maps and analyses our activity in terms of media, digital, brand and staff comms. We then also collate activity against five strategic priority themes, as agreed with the exec directors at the start of the year.”
Ad hoc evaluation and impact measurement
However, this level of evaluation activity is beyond many communications teams, which lack the resources to sustain it. Many respondents said evaluation is one of the activities most easily overlooked when they are short of resources.
One communicator said: “Measuring and evaluation only takes place on an ad hoc basis. This is due to the low number of staff compared to the high volume of work – often this isn't seen as a priority.” Another said: “This is our most challenging area. Change is happening to demonstrate RoI [return on investment] for every job we do.” More worryingly, one communicator said: “People in the NHS unfortunately do not understand and therefore value the importance of corporate comms, as so much of our work is hard to measure (qualitative not quantitative) and we never have the budget for proper evaluation to demonstrate this. Where we are understood and valued (e.g. by board members) they are worried about the perception of investing too much in comms as we are not frontline or clinical.”
Advanced forms of evaluation focused on measuring outcomes
A smaller number of communicators reported that they are trying to deliver more sophisticated impact assessment and evaluation of the work they do – for example, based on the changes in patient and public behaviour that their trust may be targeting, or how communications can contribute to improving the patient experience. A number of communicators cited formal evaluation frameworks that they are using to support this activity, for example the Government Communication Service’s evaluation framework and AMEC’s integrated evaluation framework.
One communicator said: “As part of how we work we establish KPIs for channels and activity based on strategic plan or objective. We monitor channel effectiveness, run staff and public engagement groups and use technology to review interactions. Official measures within the trust are mainly staff survey engagement and CQC feedback. We demonstrate through keeping our profile high and sharing our impacts with relevant, mainly internal, audiences.”
Another commented: “I am steering the team towards evaluating what we do and trying to demonstrate how what we do impacts on the overall effectiveness of the trust. I do not want us to measure our success on how many press releases we issue or tweets we send but on the behaviour changes we can make happen and most importantly can we contribute to improving the patient experience.”