Psychological Safety ≠ “Be Vulnerable”

Psychological Safety has (finally) entered the mainstream discourse in business communities. Particularly in the world of #agile and #agile_coaching, it is a topic I hear referenced with increasing frequency and conviction. There’s just one problem.

Most people have no idea what it actually means.

Through various discussions, online and in person, in sessions at Agile meetups, and at numerous conventions, I’ve listened to agilists and consultants preach the importance of Psychological Safety while touting the research surfaced by Google’s Project Aristotle (though pioneered by researchers like Amy Edmondson).

Yet most managers, consultants, and agilists have embraced the concept of Psychological Safety without understanding the context of the term. Instead, Psychological Safety is often equated with the idea of “being vulnerable with your team.” Agile Coaches and Scrum Masters, focused on achieving that Holy Grail of coaching – building a high-performing team – are co-opting the term “Psychological Safety” to mean “intra-team-member trust.”

Please. Stop.

Trust within a team is certainly a vital characteristic to develop; absolutely no question there. Team members working together need to understand one another, trust in each other, communicate, collaborate, innovate, problem-solve, make decisions, implement, improve, learn, and produce together effectively. Trust is fundamental, and team trust is an important component of Psychological Safety.

Part of building that trust means, yes, taking some (perceived) interpersonal risk. Examples of perceived “risky behavior” in this area of interpersonal trust-building within a team might include:

  • Asking questions, which risks revealing a perceived lack of knowledge, awareness, or understanding;
  • Questioning assumptions and/or voicing opinions, which risks challenging social (and sometimes formal) hierarchies and upsetting power relationships;
  • Admitting mistakes, which risks appearing like an under-performer at best, or incapable and incompetent at worst.

The point is, these sorts of things are clearly trust issues, but we have a word for these issues: trust.

What, then, is the difference between Trust and Psychological Safety?

Trust in this story is interpersonal and intra-team: it concerns the degree to which individuals within the same team believe in the reliability, credibility, ability, and intentions of their teammates. We certainly cannot build an environment of Psychological Safety without Trust.

Psychological Safety, on the other hand, refers not only to the degree to which team members trust one another, but also to the environment (external) in which the team operates. In fostering intra-team trust, the team is neither attempting nor able to affect the external environment in which the team operates (the business or organization, its culture, explicit and implicit rules and processes, explicit and implicit structures and hierarchies, norms, values, etc.).

The environment in which the team functions will ultimately have the greatest effect on Psychological Safety. Regardless of how much team members trust one another, if the team operates in an environment which is hostile to open and honest communication, punishes individuals and teams for mistakes, or engages in or encourages retribution for perceived protocol infractions, slights, or failure to follow dogmatic processes, all the trust in the world isn’t going to develop Psychological Safety.

Psychological Safety

What does build Psychological Safety?

First and foremost, lead by example. The truth is – and there is ample social and cognitive psychological research to back this up – that people follow, and imitate, those they identify as leaders. When we perceive someone as “doing well,” we emulate the aspects of their personality and habits that we interpret as contributing to their success. We follow them (on social media, online, in print, at conferences and expos) and watch what they’re doing and what they’re saying.

How this plays into Psychological Safety is simple. If you’re in charge of a team, department, or organization, and you want people to admit to (and learn from) mistakes along the way, start by exemplifying that behavior. Walk into meetings and admit the last big mistake you made, what it meant, what you learned from it, and how you plan to adjust in order to do better next time.

Yes, it sounds crazy, but the first step is admitting your own fallibility and creating a culture in which it is okay to do that very thing. By re-framing failure as a learning opportunity – the central theme of The Lean Startup (by Eric Ries) – you show everyone around you that failure, and owning up to it, is perfectly fine when it is followed by thoughtful analysis, learning, and meaningful change or improvement.

Many people in leadership positions (note that I do not refer to them as leaders) are far, far too preoccupied with proving that they know what they’re doing and “getting it right” to actually recognize whether they’re getting it right. This is what many refer to as a Performance Mindset.

Performance Mindset: I’m here because I’m the expert, I know what I’m doing, I’m not wrong, I don’t make mistakes, and my job (and self-image) depends on my ability to perform!

Those in leadership positions who possess a Performance Mindset are typically unable to admit mistakes or fallibility, tend to punish others – directly or indirectly – for making mistakes, and can be prone to rebuking those who challenge their positions or opinions. It’s tough to be challenged when your perception of self – personal and professional – is grounded in an image of expertise and infallibility.

Developing Psychological Safety in teams and organizations which are led by individuals (execs, directors, or managers) who are firmly fixed in the Performance Mindset is exceptionally challenging. I wouldn’t say impossible – I’d just say I have yet to see it done.

To be certain, there are environments in which a Performance Mindset is exceptionally useful, if not vital to success. Traditional (Waterfall) project management is, in fact, built around the manager or leader with a Performance Mindset. For companies and organizations that operate in simple-to-complicated domains, experts who have developed vast amounts of experience and knowledge over many years of work are invaluable.

Simple or complicated work streams lend themselves perfectly to the application of best practices or known-good practices. We just need experts who can apply their knowledge and experience to help us achieve predictable, known outcomes.

Now, let’s step into Complexity. Complexity is a domain in which causal connections are unclear or hidden. We can make observations and discern clear correlations emerging between events, but it is impossible to definitively prove a causal relationship between them.

Enter the Complexity Mindset.

Complexity Mindset: I’m here because I know how to enable teams and individuals to make observations, analyze goals, make decisions, execute, reflect and learn, identify improvements, adjust, and iterate!

The opposite of the Performance Mindset is the Complexity Mindset. Leaders with this mindset (regardless of their designated position within a hierarchy) focus on clearly articulating a vision and intent while empowering their teams to execute with as much autonomy as possible. The ultimate expression of this mindset is a company or organization in which teams understand at deep levels what their objectives and constraints are, and guide their own execution according to those objectives and constraints.

A Complexity Minded leader not only empowers teams to execute according to principles of self-organization, but also encourages and rewards learning through both success and failure. When you don’t know how to get from A to B, mistakes will occur.

The Lean Startup is, at its essence, a prolonged essay on the execution of a Complexity Mindset. When developing a new product that we don’t know whether anyone will like or use, we learn which aspects of the product people like or dislike, and we adjust to build more of what they like and less of what they don’t. If people don’t like the product at all, we look for positive leading indicators that may point us toward new products with a better chance of success.

We are learning, despite failures. By iterating on a product, observing its adoption, deciding what to change, changing it, and returning to observe – an Observe-Decide-Act-Observe (ODAO) variant of the OODA loop – we have a much greater chance of succeeding.
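
To make that loop concrete, here is a minimal sketch in Python. Everything in it is hypothetical – the “retention” metric, the decision threshold, and the effect sizes are invented purely for illustration, not drawn from The Lean Startup or any real product:

```python
# A minimal sketch of the Observe-Decide-Act-Observe (ODAO) loop described
# above. All names and numbers are invented for illustration.

def observe(metrics: dict) -> float:
    """Reduce raw adoption signals to one leading indicator (here: retention)."""
    return metrics["retention"]

def decide(indicator: float) -> str:
    """Pick the next move: build more of what works, or change course."""
    return "double_down" if indicator >= 0.40 else "pivot"  # invented threshold

def act(decision: str, metrics: dict) -> dict:
    """Ship a change; in reality this is product work, not a dict update."""
    delta = 0.05 if decision == "double_down" else 0.02  # invented effect sizes
    return {"retention": metrics["retention"] + delta}

metrics = {"retention": 0.35}
for i in range(4):  # each pass ends by returning to observe()
    decision = decide(observe(metrics))
    metrics = act(decision, metrics)
    print(f"iteration {i}: {decision}, retention={metrics['retention']:.2f}")
```

The point of the sketch is the shape of the loop, not the numbers: every iteration ends where it began, with observation.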

Software companies like Google, Amazon, Facebook, Microsoft, and Apple do this often. Facebook’s Dislike button, Tay (the offensive AI chatbot from Microsoft), Windows Vista… you get the idea. Heck, with a roughly 90% failure rate, the entire startup scene in Silicon Valley is a fantastic example of attempting to navigate social, financial, and consumer complexity through experimentation.

The point is, companies never start out with an intent to fail or make mistakes, and neither do your teams. Yet the simple fact of reality is that mistakes and failures do happen – especially when you’re trying to do something that’s never been done before, or build something that’s never been built before. A mature, resilient, learning organization knows that challenges and setbacks are inevitable, and is able to overcome them and succeed.

What about responsibility and accountability?

I quite often hear this refrain: “If we don’t hold people accountable” (a code phrase for punishing people for mistakes), “people will think they can get away with anything!” False. First, if you hear this from a leader, beware. They have just given you an insight into how they think and how they would likely behave, given the chance.

The truth is, the overwhelming majority of people – especially in knowledge work – are there specifically because they want to do a good job. That said, yes, there are clear red lines in terms of performance and accountability. This is the difference between honest mistakes and negligence.

When Amazon Web Services went down recently, taking a considerable portion of the internet with it, Amazon was incredibly forthcoming in sharing its root-cause diagnosis: an engineer had mistyped a command input, taking far more infrastructure offline than intended. It was an honest mistake.

I once watched as a friend attempted to enter TTYL (talk to you later) into an SMS to his wife, which quickly auto-corrected the phrase to Tina. As we were on a business trip, it could have raised some pretty legitimate concerns from his wife’s perspective. Instead, he replied “sorry, stupid auto-correct – meant TTYL” and sent that, instead.

Honest mistake.

Now, let’s imagine a scenario in which a government worker with a security clearance decides to take some classified work home with him on a USB drive to continue reviewing at home. At home, he plugs in the USB drive and the malware on his computer instantly dumps that classified information out to Wikileaks.

There are federal regulations (laws) which govern the handling of classified information, and those laws (in our example) have just been broken.

This is not an honest mistake. This is negligence (or worse).

Negligent (or worse, outright illegal) behavior needs to be corrected, yes, and those engaging in it need to be held to account, perhaps outright fired. But the engineer who mistyped a command wasn’t being negligent – he made an honest mistake.

We want to know about honest mistakes because they offer us great opportunities for improving our systems. In the AWS case, for example: why was an individual able to take down such a large system without having to confirm the typed command? Why did they have to type the input at all (no copy/paste, no UI)? How could the system itself have been designed to prevent this sort of outage?
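
To make that last question concrete, here is a sketch of the kind of guardrail that could help. Everything here is hypothetical – the function, the threshold, and the fleet sizes are invented to illustrate the principle (cap the blast radius, force confirmation on big changes), not to reconstruct Amazon’s actual tooling:

```python
# Hypothetical guardrail around a capacity-removal command. Illustrates the
# principle (cap the blast radius, require confirmation for large changes);
# this is not a reconstruction of Amazon's real tooling.

MAX_UNCONFIRMED_FRACTION = 0.05  # invented threshold: 5% of the fleet

def remove_capacity(requested: int, fleet_size: int, confirm: bool = False) -> int:
    """Take `requested` servers offline, guarding against fat-fingered input."""
    if not 0 < requested <= fleet_size:
        raise ValueError(f"requested={requested} must be between 1 and {fleet_size}")

    fraction = requested / fleet_size
    if fraction > MAX_UNCONFIRMED_FRACTION and not confirm:
        # Fail safe: large removals must be explicitly confirmed.
        raise RuntimeError(
            f"refusing to remove {fraction:.0%} of the fleet without confirm=True"
        )
    # ... decommission the servers here ...
    return requested

remove_capacity(5, fleet_size=1000)      # fine: 0.5% of the fleet
# remove_capacity(500, fleet_size=1000)  # a mistyped '500' now fails loudly
```

With a guardrail like this, a mistyped number produces an error instead of an outage – the system absorbs the honest mistake.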

Conclusion

Can you tell I’m passionate about this stuff???

By now I hope you’re developing the sense that Psychological Safety in teams is not fully within the team’s ability to control. Management and leadership have a significant ability to determine whether teams feel empowered to admit mistakes, learn from them, and make improvements – or whether they hide mistakes, silo important knowledge and information, and leave systems (and the organization) to succeed or fail on their own.

To develop and enhance Psychological Safety in your complex workplace, leaders need to adopt a Complexity Mindset and focus on the following:

  • Own your mistakes. That manager at AWS who declined the work to improve the automation around service outages because it didn’t seem worth the return on investment? They’d better be standing up right now and admitting that they made that call, that it was a bad call, and that Amazon does need to invest in improving its outage automation. The quote sounds like this: “It wasn’t the engineer’s fault for causing the outage; it was my fault for not allowing the engineering team to prevent the outage before it occurred. My bad. Can someone help me fix it?”
  • Support your people and your teams, especially when they make honest mistakes. Help them re-frame those mistakes as opportunities to improve systems, processes, practices, and skillsets, and then empower them to make those improvements.
  • Reward individuals and teams for sharing mistakes, challenging assumptions, questioning decisions and policies, and striving to find “better ways.” Public support and thanks are powerful signals that those are desired behaviors, and will get you more of the same.
  • Encourage open environments in which people can freely share issues, concerns, and admit when they aren’t sure about how to solve a problem. Ensure that people asking for help get help, not ridicule. A culture in which individuals pride themselves on showing everyone how smart they are is a culture in which no one admits when they don’t know something, leaving you blind to risks and pitfalls. A culture in which people feel empowered to ask for help – and then get it – in order to solve challenging issues and mitigate potential risks, is a culture which will build resilient, adaptable systems and organizations.

Agree? Questions or comments? Do you entirely disagree and feel that Psychological Safety is actually all about admitting vulnerability? Let me hear about it. If you don’t want to respond here in the comments, send me an email! I’m more than happy to hear from you, engage in constructive debate, and learn about where I have it wrong. Because I make mistakes, too.

By the way – thank you for reading!

 

This post originally appeared March 9th, 2017 on LinkedIn here: https://www.linkedin.com/pulse/psychological-safety-vulnerable-chris-alexander
