Next week, I’m teaching the summer semester version of my Columbia University course called Analytics and Leading Change for the Master’s Degree program in Applied Analytics. While there are elective courses on change management in business and public administration schools, this combination of analytics and change is unusual. The course is also a requirement. Naturally, I’ve been asked: why?
The general answer is that analytics and change are intertwined.
Successfully introducing analytics into an organization shares all the difficulties of introducing any new technology, but more so. The impact of analytics – if successful – requires change, often deep change that can challenge the way that executives have long thought about the effect of what they were doing.
As a result, the reaction to new analytics insights is often a knee-jerk rejection, a point one Forbes columnist made last year in an article titled “Why Do We Frequently Question Data But Not Assumptions?”.
A good, if early, example of the impact of what we now call “big data” goes back twenty-five years, to the days before downloaded music.
Back then, the top 40 selections of music on the “air” were based on what radio DJs (or program directors) chose, and, beyond that, the best information about market trends came from ad hoc surveys of record store clerks’ observations. Those sources, too, emphasized new mainstream rock and pop music.
In 1991, in one of the earliest big data efforts in retail, a new company, SoundScan, came along and collected data from automated sales registers in music stores. What they found went against the view of the world that was then widely accepted: old music, like Frank Sinatra, and genres other than rock were very popular.
Music industry executives then had to change the way they thought about the market, and many of them didn’t. This would happen again when streaming music came along. (For more on this bit of big data history, see https://en.wikipedia.org/wiki/Nielsen_SoundScan and http://articles.latimes.com/1991-12-08/entertainment/ca-85_1_sales-figures.)
A somewhat more recent example is the way that insights from analytics have challenged some of the traditional assumptions about motivation held by many executives and corporate human resources staff. Tom Davenport’s 2010 Harvard Business Review article, “Competing on Talent Analytics,” provides a good review of what can be learned, if executives are willing to learn from analytics.
The first, larger lesson is this: if the leaders of analytics initiatives don’t understand the nature of the changes they are asking of their colleagues, those efforts will end up as nice research reports, and the wonderful insights generated by the analysts will disappear without impact or benefit to their organizations.
The other side of the coin, and the second reason that analytics and change leadership are intertwined, is a more positive one. Analytics leaders have a potential advantage over other “change agents” in understanding how to change an organization. They can use analytics tools to understand what they’re dealing with and thus increase the likelihood that the change will stick.
For instance, with the rise of social networks on the internet, network analytics methods have been developed to understand how the individuals in a large group influence each other. Isn’t that also an issue in understanding the informal, perhaps the real, structure of an organization, which traditional organization charts don’t illuminate?
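To make that concrete, here is a minimal sketch of what such an analysis might look like in Python, using the networkx library on a small, entirely hypothetical set of “who works with whom” interactions (the names and connections are invented for illustration, not drawn from any real organization):

```python
# A minimal, hypothetical sketch of using network analytics to surface
# informal influence in an organization. The interaction data below is
# invented for illustration; in practice it might come from email,
# calendar, or collaboration-tool logs.
import networkx as nx

# Each pair means "these two people interact regularly" (hypothetical data).
interactions = [
    ("Ana", "Ben"), ("Ana", "Carla"), ("Ben", "Carla"),
    ("Carla", "Dev"), ("Dev", "Elena"), ("Dev", "Farid"),
    ("Elena", "Farid"), ("Carla", "Grace"),
]

G = nx.Graph()
G.add_edges_from(interactions)

# Betweenness centrality highlights people who sit on the paths between
# groups, often the informal brokers that an org chart never shows.
brokers = nx.betweenness_centrality(G)

for person, score in sorted(brokers.items(), key=lambda kv: -kv[1]):
    print(f"{person}: {score:.2f}")
```

Even this toy example makes the point: the person with the highest score may not be the one at the top of the formal hierarchy, and that is exactly the kind of insight a change leader can use.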
In another, if imperfect, example, the Netherlands office of Deloitte created a Change Adoption Profiler to help leaders understand people’s differing reactions to proposed changes.
Unfortunately, leaders of analytics in many organizations too infrequently use their own tools to learn what they need to do and how well they are doing it. Pick your motto for this – “eat your own dog food” or “physician, heal thyself” or whatever – but you get the point.
© 2017 Norman Jacknis, All Rights Reserved. @NormanJacknis