Poor decision making doesn’t get better with time

Five years ago in the UK, on 16 March, the Prime Minister uttered the following words, “…now is the time for everyone to stop non-essential contact with others and to stop all unnecessary travel. We need people to start working from home where they possibly can. And you should avoid pubs, clubs, theatres and other such social venues”. It was a call for the country to support the Government in facing into the global pandemic and many organisations duly followed that advice. Five years on, whilst the majority of us are happily frequenting pubs, clubs, theatres and other social venues, we are still getting ourselves in a pickle over working from home. So what went on?

I’ve written so many times about this topic and I always feel I need to make the following caveat clear – I am not making a value judgment about where people work or the decisions that organisations take. Where I get a little grumpy around the edges is the inference that “everyone is working from home” – this is just plain wrong – or that not allowing people to work from home is somehow “old fashioned” or “lacking in trust” – again this is factually wrong and as intellectually limited as saying that anyone who works from home is skiving.

But I don’t want to talk about the pros and the cons, those have been well debated to the point of exhaustion. I want to talk about how decisions are made, and how we get them wrong.

In the days leading up to 16 March 2020, most organisations had a pretty stable work pattern. Lots had flexibility built into that in different ways, others didn’t. Organisations were based in different locations around the country and the globe and job seekers made decisions about where they’d work depending on where they lived, how they could travel and where they were willing to move to. It was by no means perfect, but it was understood by all involved.

Then the world got complicated for a period of about a year and we had to make changes, show flexibility, behave in different ways in order to support the collective need. Roll forward five years and most of those aspects of our lives have pretty much returned to the “normality” of pre-pandemic operations and whilst I’d love people to continue to socially distance (but that’s just me being anti-social) that isn’t going to happen any time soon.

Where people work, however, is still a bone of contention for lots of organisations. So what happened in this debate that made it so different to all the other temporary changes? Lots of organisations announced very quickly that they were making permanent changes. Why?

  • HR leaders advocated for policies that suited their own working preferences rather than business need and suggested this was a market trend (“the future of work”) – and as more organisations announced the change, it became a self-fulfilling prophecy.
  • Finance leaders saw an opportunity to reduce the cost of property to their businesses by either disposing of real estate or exiting leases, meaning there was less space in their premises even if people wanted to work there.
  • Employees, at least the vocal ones, announced they were more productive and generally happier. Let’s not forget that the weather in the summer of 2020 was particularly nice too. Dissenting voices or those that questioned the direction were judged to be modern luddites.

And after a turbulent period of time, it felt like a win-win-win. What was lacking was any real strategy, any data or evidence, any proper business case or evaluation of alternative outcomes. Whether you agree with the outcome or not, the decision making process was woefully poor and counter to the way that organisations would make any other major change.

Five years on and some organisations are rowing back on their positions and with it there is more grief, more upset and hurt, more conflict with parts of the workforce. Understandably, employees feel they were told one thing, promised one future, and are now being delivered another. One day becomes two days, becomes three or four – even Sainsbury’s are noticing the change and signalling the return to the “weekly shop”. All of this could have been avoided by more thought, better decision making processes, and a little bit more sangfroid. Poor decision making happens, no one is immune, but the one thing we can almost guarantee is that bad decisions never get better with time – no matter how long you leave them.

If you want fairness, you might need to give something up

I came into business on the back of studying Psychology many light years ago before it became such a hot topic for undergraduates. Fortunately for me, back then you didn’t need three straight As to get anywhere near the hallowed halls of university. The course that I took, and the modules I chose, focused a lot on child development, something that I’ve light-heartedly (and sometimes seriously) suggested prepared me well for dealing with fellow execs and the world of work.

Let’s take the concept of fairness. There are countless studies that show that at quite a young age, around three or four, children understand the principles of fairness. They understand that if you have three toys and two friends, then the fair thing is for every child to have a toy to play with. But there is often a gap between understanding and behaviour depending on the context being observed – whether there is a level of collaboration, whether rewards are given separately, whether there is a windfall.

Fast forward into the world of work and whilst we aren’t necessarily talking about toys anymore (unless you work for Hasbro or Lego), the concept of fairness is something that we talk about a lot. For example, we’ve all seen the various cartoons and explanations of the difference between equality and equity. Similarly, studies have shown that employees view fairness at work as being one of the fundamental pillars of “a good workplace”.

But similarly to the kids in the experiments who understand the concept of fairness but don’t want to share their toys, there is often a gap between our desire for a fair outcome and our willingness to accept that this might mean we, ourselves, need to give something up. Let’s go back to where we started this post, with getting into university. Unless university places increase exponentially (and there’s another post in here somewhere about whether they have tried to do this at great cost to young people), then increasing access and widening participation is likely to mean that the children of people who might previously have attended without batting an eyelid might not get into the university or course of their choice.

The whole concept of WFH (yes I am going to mention this again) is another example of us not factoring fairness into our own decision making. The proponents of WFH will often say, “I am more productive”, or “I can work like this, so why shouldn’t I?”. And of course they’re entirely correct and in some organisations where everyone can do the same, say a digital marketing agency, that might be a fair direction of travel. But in those where that isn’t the case, let’s say a retail bank, how many would give up their own freedoms in order to create a fairer workplace for those that can’t?

Why is this important? Is this just another opportunity for Neil to have a pop at working from home? Well no, not on this occasion, it is important because as leaders, as people professionals, it is fundamentally our responsibility to shape workplaces that are as fair as possible. And to do that we need to do a number of things: we need to make decisions that won’t be popular with some and not be lured by the idea that fairness and approval are the same thing; we need to be very aware of our own perceptions of fair and what we personally might need to give up; and we need to recognise that it will never be perfect and that not being perfect is ok as long as we are constantly checking in on our decisions and our approach and how we can make them fairer, little by little, bit by bit.

In praise of personnel

I started working in the profession in 1996, the year that saw Take That split and the airing of the last episode of the Fresh Prince of Bel Air – although I don’t remember either of these things in much detail, I admit I had to look them up. AOL was also named the most popular website of the year, for the c.20 million people that had access to the Internet, but behind the scenes a little known company called Google was indexing the web – but it wouldn’t have its own domain until the following year.

At work, I wrote out memos that were typed by a typing pool and delivered by hand in the internal mail system. And I was called a Personnel Services Officer, worked in Personnel – and I provided services to the personnel.

Nearly 30 years later, the world and the world of work has changed considerably. I’m writing this on technology that I couldn’t envisage would exist, to share on platforms that weren’t in existence. So much has changed, and yet the fundamentals of how we come together to get things done – an activity also known as work – haven’t changed that much at all. These days, whilst I don’t go by the title of Director of Personnel, I have stuck to the HR description and frankly, I’ve got no desire to change it.

A quick search on LinkedIn will deliver you a cacophony of job titles for people doing the same and similar jobs. There are trends, counter-trends, justifications for changes (normally something about being more strategic – but we all know, calling yourself a “thought leader” doesn’t make you one). And all of these titles and descriptions are on one hand fine, but also beg a fundamental question:

Who is a job title for?

Is it for the individual, so that it represents what they want to be seen as or how they want others to feel about them? Or, is it for everyone else so they understand what that person does, what they’re responsible for and when they might be helpful or when to get them involved?

If our job as leaders and people professionals is to make organisations simpler, easier to navigate, more effective and efficient, then using simple and straightforward language might not be a bad place to start. Job titles, department and function names are how people make sense of the organisation; they’re a universally recognised shorthand that helps us to get things done. Where do I go, who do I speak to in order to carry out the task that I need to get done?

Marathon bars didn’t get tastier because they were called Snickers, Twitter didn’t become a better place because it was named X, The Independent didn’t get better editorially as The i, and we all know Hermes didn’t stop chucking parcels over hedges when it became Evri. In some ways, names and job titles don’t matter at all but in other ways they absolutely do.

Ok, so maybe Personnel was a little bit dated but people knew what it was and what it did. And sometimes, that has greater value to organisational performance than any rebrand simply to assuage the egos of the job holders. That’s something we could all do with a little bit more of.

The big admin blob that drains us all

I was struck the other day by a post that Tim Baker shared on LinkedIn suggesting that one in three HR professionals in the UK were considering leaving the profession, with 41% suggesting unnecessary admin as one of the causes. Now, of course, the company behind the research has a solution and….surprise, surprise…that happens to be exactly what they do as a company. But despite my loathing of this kind of “research”, (so much so that I’ll share the link above to Tim because he’s worth connecting with, but I won’t share any links to the company, because…it is very average), yes, despite all of this, it did get me thinking.

I wouldn’t mind betting if you asked 100 random employees what they thought of HR in their organisation, one of the things the majority would raise is the unnecessary paperwork, the bureaucracy, the processes that seem both endless and pointless. And I wouldn’t mind betting if you spoke to 100 random HR professionals and asked them what they liked least about their jobs, they’d say the unnecessary paperwork, the bureaucracy and the processes that nobody seems to follow and so are endless and pointless.

So what on earth is going on? Who is responsible and simply, why can’t this all stop? As someone who has raged against process for most of my career, to the understandable frustration and eye rolling of my colleagues, and who has (sometimes successfully but mostly unsuccessfully) tried to reduce it, I’ve got a few theories:

  1. The power of one – Every form that’s created or process that is added is only looked at as a singular piece of work, not in the entirety of the experience. So every time something is added, it seems eminently reasonable in isolation. But lumped together with everything else, the whole thing becomes an unmanageable blob.
  2. The lack of measures – I don’t know of any organisations, although they might exist, where there is a firm rule on the amount of admin that any one person is expected to do and therefore a finite limit. Why does this matter? Because if you had a firm rule and you were at the limit, then to add something in, you’d need to take something out.
  3. The fear of lawyers – Well, it isn’t really the lawyers, they’re generally an amiable bunch, it is the over regulation and imposition of onerous burdens of proof on the employment relationship that means that the simplest way to defend against anything is to document it to within an inch of its life. Although, I’m told by those on the inside that law firms are the worst for following any kind of process – surely what’s good for the goose is good for the gander?
  4. The calibre of the profession – If you’ve only ever worked in process heavy, admin focused HR functions then how can you be expected to know that anything else is possible? And anyone who suggests it can must be mad, bad or ready for retirement! But where are the creative thinkers coming through the profession who want to shape a completely different future of work? Oh yes, they’re working from home and on Teams meetings all day…adding value.
  5. The belief in a silver bullet – The very average research was carried out by a company that sells tech solutions. As long as I’ve worked in the profession we’ve been told that tech is going to be the answer, most recently AI. Systemising or automating rubbish doesn’t stop it being rubbish, it just makes it expensive rubbish that likely disappoints.

HR teams say they want to be more strategic and less admin focused, yet they are the ones that create the admin in the first place. Businesses say they want their HR teams to be more strategic and less admin focused, but they rarely hold them to account. Managers say they want to be able to get on with their jobs, but they don’t want to take the responsibility for making decisions. Sometimes it seems to me we all want the same things, but maybe it suits us all better the way that it is.