Joining the dots - using innovative, actionable employee insight to influence what employees think, feel and do

Posted: 17-09-2020

 

 


Comms and engagement research, compounded by poor practice, has had some real flaws for years.

Asking the same question in the same way because that’s ‘what we always do’? Business leaders and comms and engagement professionals deserve better from their investment.

In this session, Ghassan Karian will share case studies of how five fundamental principles and innovative solutions can deliver actionable insights for you and your leaders.

Get the behavioural science in, rethink the way you listen to your people and see results that reflect reality.

Joining the dots - Engage Internal Communication conference from Karian and Box on Vimeo.

 

Engage - Internal communication virtual conference.


Ghassan Karian - transcript, 17 September 2020


What I want to show you today is predicated on data, real data from 2.5 million employees’ contributions over the last couple of years, in surveys across multiple sectors: financial services, retail, oil and gas and a range of others. And therefore, what I’ll be showing you and talking you through is validated by tangible research into how you do employee listening that adds real value for comms, engagement and other professionals in supporting their leaders.

So, as I said, this is absolutely about the fact that if you have crap questions, you get crap answers and crap insight (pardon my French). Fundamentally, therefore, any comms research, any engagement research, has to nail the questions.

So, what I want to focus on especially today is how you use the right questions and the right data to produce killer insight that gets actionable outcomes for you, your organisation and your leaders. That’s the focus. And the starting point is absolutely, yes, getting the questions right. Understanding how people think when they complete surveys, how they complete research, quant, qual, whatever it might be, is so critical. The use of behavioural psychology, the way people think and the way people behave, is fundamental. And getting the questions right is important not just in terms of getting data that is actionable and meaningful; it’s also about avoiding biased data, survey results that would be worthy of a North Korean election. Because fundamentally the bias is so baked into the research that you’re not getting a true reflection of what’s going on out there in terms of what people are hearing, how they’re hearing it and so on. So, at best you’re going to get vanilla data that can’t be acted upon; at worst, you’re getting a skewed bit of insight. How do you overcome that?

You’ve got to get rid of the bias; you’ve got to get a sense of why people are thinking what they think, as well as the what. There’s a really important focus on not just looking through the rear-view mirror. Most research we have seen over the past years, and some we have conducted ourselves because that’s been the industry standard, looks backwards at what people thought, felt, did and experienced last week, last month and so on, rather than looking ahead and helping you course correct. Use ongoing listening for comms and engagement purposes, yes, but don’t throw the baby out with the bathwater: the annual or biannual comms and engagement survey, the annual census, the audits once every so often, the deep dives, are still valuable and important. And critically, there’s the value of integrating data together to get a broader, more holistic view.

So, getting rid of the bias is critical, and there’s a great book by a man called Dan Ariely, a Duke professor, called The (Honest) Truth About Dishonesty, which looks at how people answer questions. There are some really powerful studies he has conducted, and other really powerful academic studies elsewhere, that show that when people fill in surveys they often lie to themselves when they’re being asked about what they know or what they do. It’s pre-baked in: the subconscious makes them answer first-person questions in a way that’s more positive than it would be if the question were framed slightly differently. So, I’ll give you an example. I’m sure you’ll have seen questions like ‘I know or I understand the strategic priorities for my business’, or the values, whatever it might be.

The "I know", "I understand" question is heavily used in comms research and is often broadly wrong in the data that it provides. It often is 20 or 30 points more positive than if it was asked in the third person where you ask people "do people in your part of the business, are they clear on the priorities?”. Now, what people are doing is that they are answering that question in the third person from their own perspective, because of course, they haven’t gone out there and sought people’s views. So, you are getting a more objective view of what people know or do. Now, you can still ask questions in the first person when it relates to how people feel, what they believe, what they experience. But when you are asking them about the individual – the bias kicks in about what they know or do. You can ask it differently by getting to the heart o what thy know by using really powerful qualitative questions – like how would they describe the priorities or the business strategy, or whatever it might be to a colleague and using some of the latest technology to qualitatively analyse thematically grouped topics.

So, you are getting to the heart of what people are saying and are able to drill down. Another example of the bias: I’m sure you will all have seen the relationship question on the left, ‘My manager does this’, ‘My manager does that’. It tests the behaviours of how managers communicate with, engage and support their teams. But what we’ve found there is a huge amount of bias. If you hate your manager, or if you love your manager, you’ll answer all versions of that question in the same way. You’re not going to answer one positively and one negatively because, in essence, most of the time you have a strong or a weak relationship with your manager and you’re going to answer all of those questions positively or negatively. This is proven in millions of bits of data; you get a consistent level of bias one way or the other. To get round this, you ask the question in a different way.

So: “Thinking about how my manager communicates with or leads my team, I want them to do more of the following”. It covers the same series of behaviours we had just tested in the previous formulation, but in a much more actionable way. Suddenly you are getting actionable insight and data, not just statements that people either agree or disagree with. And that’s really powerful. Those are two examples of how you can change the language in a question, with the same intent, and get a much more actionable and objective view of what people are saying. The other thing is to look at why people are saying what they say. Now, most organisations are running surveys on a five-point agreement scale, strongly agree to strongly disagree. And the vast majority of those questions, I guarantee you if you look at your surveys, are about what people think, what people know, what people feel, what people do. What they don’t do is dig deeper into the why.

And some organisations will say, we now need to run focus groups or ask additional questions to uncover the why. But actually you’re wasting an opportunity: in a low-cost environment, when budgets are tight, you can do that within the context of the survey, if you look at what Tesco and many others do. Where you have killer questions you want to dig deeper into, you ask the original question, the ‘what’, and you follow it up with a multi-choice or open follow-up that gives you a better sense of why. You can route it on whether people answered the original question positively or negatively. It’s a really simple, low-cost way of getting the why behind the what without having to do more research or scratch your head over advanced analytics and correlations. It’s a really simple mechanism and it gets you far more value from the research.

As I said, a lot of organisations are looking through the rear-view mirror at how people felt yesterday, the day before, and so on. Looking through the windscreen is even more valuable. Again, this isn’t rocket science. This is what consumer research does again and again. To Liam’s point earlier, no senior leadership team would expect their marketing team to come to them with views on how a product landed with no previous research on what the consumer wants, what the audience needs, how the product was tested beforehand and so on; whereas in internal communications we shape a message, send it out there and then ask how it landed. So, using research, comms and engagement research, to find future solutions is critical.
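Going back to the ‘what’ question with a routed ‘why’ follow-up: here is a minimal sketch of how that structure might be expressed in code. The question wording, the answer options and the routing rule are invented for illustration; this is not a description of Tesco’s, or any particular platform’s, survey logic.

```python
# Minimal sketch: a 'what' question with a conditional multi-choice 'why' follow-up.
# All wording, options and the routing rule below are illustrative assumptions.

WHAT_QUESTION = {
    "id": "leader_comms_clarity",
    "text": "Communication from senior leaders helps me understand our priorities.",
    "scale": ["Strongly agree", "Agree", "Neither", "Disagree", "Strongly disagree"],
}

WHY_FOLLOW_UP = {
    "id": "leader_comms_clarity_why",
    "text": "What would most improve leader communication for you?",
    "options": [
        "More honesty about trade-offs",
        "Less jargon",
        "More relevance to my role",
        "More chance to ask questions",
        "Other (please tell us more)",
    ],
    # Only dig into the 'why' for the respondents you want to understand better.
    "show_if": lambda answer: answer in ("Disagree", "Strongly disagree"),
}

def next_question(answer_to_what: str):
    """Return the follow-up question if the routing rule says it should be shown."""
    return WHY_FOLLOW_UP if WHY_FOLLOW_UP["show_if"](answer_to_what) else None

print(next_question("Disagree") is not None)  # True: this respondent is asked why
print(next_question("Agree") is not None)     # False: no follow-up shown
```

The same routing could just as easily target the people who answered positively, if what you want to understand is what is working well.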

Organisations like Tesco, BP, Barclays and many others are using questions to inform the decisions they make, the messages they share and the channels they use. One killer question that BP uses is a really good example; Bernard Looney is an evangelist for listening to his workforce and using that to inform what he says and does. Every single week they put questions to about 3,000 employees. The really important one from a comms perspective is this very simple one: “What one single question would you like Bernard Looney to answer in communications over the coming months?”. What that helps you do is plan your communications effectively, the messages, the things that people want to hear, so that you can course correct, focus and target your messages accordingly.

It’s a really simple way of doing it, with really powerful outputs: qualitative feedback that you can again thematically group and use to focus your messaging. Continuous listening, the focus on listening again and again to shape and evolve the way you communicate and the way you support teams, is really important. But often that comes at the expense of, people say, the annual survey and the deep-dive census of the workforce, which is not directly a comms focus, more an engagement and culture focus, but it’s still relevant here. The loss of that leaves you with really big misses. Without the annual census, or an audit of your population in terms of its comms and so on, you aren’t able to deep dive into the hierarchy, down to the most granular layers, simply because there is not a single piece of continuous, weekly, monthly or quarterly sampling-based research conducted to listen to the workforce that gets more than a 40 or 50% response rate.

It just doesn’t happen. You don’t have enough organisational focus on getting that listening deep enough, and therefore you can’t report down to team level if you want to focus on team behaviours in communication, and so on. The other key thing is that you can’t do D&I analysis based on samples. It’s just not possible. You can do it around gender, because obviously you have large enough samples, but when it comes to ethnicity, religion or LGBTQ+ status you can’t do it with any guarantee that it’s meaningful. And of course, it doesn’t allow you to do, as robustly, the kind of integrated analytics with customer and performance data, because again you haven’t got enough data to do it.
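As a back-of-the-envelope illustration of why sample-based listening struggles with D&I cuts, here is a short sketch using the standard margin-of-error formula for a proportion. The weekly sample of 3,000 and the subgroup shares are assumptions chosen only to make the arithmetic concrete.

```python
# Rough sketch: how wide the error bars get when you slice one sample wave
# into smaller demographic groups. Sample size and shares are illustrative.
import math

def margin_of_error(n: int, p: float = 0.5, z: float = 1.96) -> float:
    """Worst-case (p = 0.5) margin of error for a proportion, in percentage points, at ~95% confidence."""
    return z * math.sqrt(p * (1 - p) / n) * 100

wave_size = 3000  # e.g. one weekly pulse sample
subgroups = {
    "Whole sample": 1.00,
    "Gender split (assume ~50%)": 0.50,
    "One ethnic-minority group (assume ~4%)": 0.04,
    "One small demographic or team (assume ~1%)": 0.01,
}

for name, share in subgroups.items():
    n = max(1, int(wave_size * share))
    print(f"{name}: n = {n}, margin of error ~ +/-{margin_of_error(n):.1f} points")

# Roughly: +/-1.8 points for the full sample and +/-2.5 for a 50% split,
# but around +/-9 points for a 4% group and +/-18 points for a 1% group,
# which is far too wide to report movement with any confidence.
```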

Co-op is a great example of an organisation that does ongoing listening, complemented by an annual census. And a lot of organisations are now going down that path. They’ve learnt their lesson, so to speak: yes, continuous listening is a critical part of their research landscape, but they’ve then needed to reintroduce an annual census, or keep one, to get those additional elements. And the other key thing in shaping continuous listening, particularly from a comms perspective, is being clear about what you’re continuing to listen to, what kind of research questions you run. A critical finding we’ve identified is that what people know and what people feel changes very regularly.

Daily, weekly, monthly, depending on external events, big change, what leaders say and do, and so on. Therefore, testing that weekly, monthly, quarterly, whatever, is absolutely fine, because you are able to monitor the impact of those events and activities on how people feel and what they know. What doesn’t change very quickly is the culture: the behaviours employees experience, and their experience of the systems and processes and so on. And organisations that say ‘we absolutely must run ongoing culture research quarterly, or on a weekly or monthly basis’ are being sold a dud. Because all you’re going to see is very slow evolution, and therefore it’s a waste of questions and a waste of money to conduct culture research any more often than on a 12-month cycle, because you’re just not going to get much meaningful movement.

The integration of data is critical. Absolutely, running research that gives you individual scores around, for example, what people are seeing is important. How leaders are communicating is important. Whether employees trust what leaders say is important. But it’s when you integrate different data together, as this example shows, that you get real value.

So, here’s a classic example of something Aviva has done with their research, and others have done too. This isn’t Aviva’s data, but what we find is very common: when you integrate how people are hearing from leaders with whether or not employees trust what their leaders say, you find that certain types of channels, certain types of communication, correlate really strongly with higher levels of trust, and others don’t. Guess what: the one-way kind, often email-based, films, instant messaging and so on, correlate poorly, with lower levels of trust and engagement. Because what you are doing is broadcasting at a broader population, some of whom aren’t having any interaction with either their manager or a leader. The further you go up the interactivity chain, through social and then physical channels (and I include Teams and the like in physical), the greater the level of trust.
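As a sketch of what that kind of within-survey integration can look like analytically, the snippet below correlates each respondent’s exposure to a channel with their trust-in-leaders score. The channel list, the data and the scores are entirely invented; this is not Aviva’s data or Karian and Box’s actual model, just the shape of the analysis.

```python
# Sketch: correlating channel exposure with trust in senior leaders.
# Every value below is made up for illustration; each row is one respondent.
import pandas as pd

df = pd.DataFrame({
    # 1 = heard from leaders via this channel recently, 0 = did not
    "email_broadcast":   [1, 1, 1, 0, 1, 1, 0, 1],
    "intranet_video":    [0, 1, 0, 0, 1, 0, 1, 0],
    "team_huddle":       [1, 0, 1, 0, 1, 1, 1, 0],
    "leader_qa_session": [1, 0, 1, 0, 0, 1, 1, 0],
    # 1-5 agreement with "I trust what our senior leaders tell us"
    "trust_leaders":     [4, 2, 5, 2, 3, 5, 4, 2],
})

channels = ["email_broadcast", "intranet_video", "team_huddle", "leader_qa_session"]

# Pearson (point-biserial) correlation between exposure to each channel and trust.
correlations = df[channels].corrwith(df["trust_leaders"]).sort_values(ascending=False)
print(correlations.round(2))
```

In this toy data, as in the pattern described above, the interactive channels (huddles, Q&A sessions) show the stronger positive association while the one-way broadcast channels show little or none; the same join also works for hard metrics such as absence or sales aggregated at team level. Correlation here is directional evidence, not proof of causation.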

Seeing the whites of your leaders’ eyes, hearing their voices, getting from them a sense of empathy and a sense of authenticity, where it exists, drives up trust and drives up engagement. And it’s not just about integration of data within research. Getting a sense of, for example, how often team managers communicate and interact with their teams is a really powerful measure in and of itself. But when you integrate that with things like engagement scores, and also absence data or other hard metrics, you get really hard evidence, to Liam’s point, to show leadership teams that when team leaders have more regular, more open and more discursive catch-ups with their teams, they not only drive up engagement, they, more critically, drive up the right performance metrics, driving down absence and driving up sales. In the Co-op and in others we’ve found that greater interactivity, regularity and effectiveness of team communications, and manager and leadership behaviours like encouraging communication, creating a safe space for discussion, recognising team members and communicating effectively, all drive up sales and drive down absence and attrition.

So, I have tried to squeeze a twenty or thirty-minute presentation into ten minutes or thereabouts, focused on how, yes, insight is critical, but getting the insight right is even more fundamental.

Because if you go into a leadership team with data that is vanilla, that isn’t actionable, that is biased, that doesn’t enable you to advise them properly on what to do, very quickly leaders will stop listening. So it isn’t just about conducting research and using the data; it’s how you do it as well.

 

Q&A from the session


Where does behavioural science fit into all of this? 
Well, behavioural science is at the heart of this. You need an understanding of how people think and behave, whether it be in the way they answer surveys (as I said at the outset), in the way that you nudge, so you can create solutions built around nudges, or in using research to understand the personas of different types of employees: how they think, how they behave, how they come to work.

There’s a great example from British Gas years ago. We identified, behaviourally, how engineers on the road travelled around, how they did their jobs, and what motivated them in terms of what kinds of communications they wanted to hear and listen to, and that enabled the design of a much more effective solution in what we called British Gas Radio. As those engineers were driving around, that was the only point at which you could get to them, and therefore listening to the stories, the news and the information was the only way you could reach them, and reach them meaningfully, because they spent literally a third of their day driving around. They were a captive audience, so understanding that, the behaviours, the motivations and the way people work, is critical to designing the right communication solutions.


Which of these five fundamental principles is the most crucial in your view? 
That’s a really good one. It’s like asking which of your children you love the most; you love them all in different ways and they all have a role to play. That’s one of the hardest questions I’ve ever been asked. I think it’s number three, looking through the windscreen. The reason being that leaders do want to know what people are feeling, but it’s like going to the doctor and saying, can you give me an assessment of where I’m at physically; the doctor tells you, and you say, OK, you’ve told me about my problem, now what do I do? Going to leaders with solutions predicated on what your audience is saying is fundamental. And therefore, using questions and research to inform the decisions you are going to make, not just to get feedback on what you’ve already done, the message, the town hall, the activity, the engagement solution, is critical.

Organisations should be taking a leaf out of marketeers’ books, and there should always be an 80/20 rule: 20 percent on what you’ve already done, message testing, activity testing, establishing where your population is at, what they think, feel and do; 80 percent focused on the future.

So you can course correct the messaging, you can shape activity, shape solutions that are relevant. We are running lots of research with banks at the moment about the future of their workplaces, in terms of how organisations are reconfiguring their property portfolios, how they communicate about the future of working patterns and so on. What a lot of them are doing is using deep insight into their employees’ fears and motivations, what they know and what they need, to design those future solutions, not just designing them and then testing how well they land.
 

 

 
