Does Anyone Care about Corporate Culture Anymore?

Remember when “Corporate Culture” was the hottest topic in HR and business in general?

Does anyone care about corporate culture anymore, or is it just about employer brand now? There is a difference that HAS to be talked about…

We used to AIM relentlessly for great company “cultures” that would create a competitive advantage and make us stand out as a great place to work. I have not heard anyone talking about this since before COVID.

I DO hear about employment branding quite often. Creating the right marketing, social strategy, and overall brand “identity” that attracts customers and candidates alike. That still seems pretty important for most companies… but is it just me, or have companies forgotten about their actual culture within? As long as the “brand” is performing well, it feels like we’re less concerned with the dynamics inside.

Think of the “Instagram vs. reality” analogy: employer brand is the Insta, and culture is the reality.

I’m seeing far too much focus on how we position ourselves digitally versus how we foster the behaviors and environments we actually want internally.

So, is it just the market conditions right now? Will this not matter until we’re all struggling to hire again?

Is it just the new way of the world (image over reality)?

How do we get back to the INTERNAL focus: building fun, safe, innovative (etc.) cultures and environments for the people already inside?

This is going to be more critical than ever with the number of RTO initiatives happening, and yet I’m not hearing anyone talk about it…

Thoughts?
