Campaign Analytics: What Separates The Good From The Bad

Donald Trump, as a candidate for President last year, expressed great skepticism about the use of analytics
in an election campaign.  Hillary Clinton made a big deal about her campaign’s use of analytics. Before that, President Obama’s campaigns received great credit for their analytics.

If you compare these experiences, you can begin to understand what separates good from bad in campaign analytics.

Let’s start with the Clinton campaign, whose use of analytics was breathlessly reported, including this Politico story about “Hillary’s Nerd Squad” eighteen months before the election.

However, a newly released book, titled Shattered, provides a kind of autopsy of the campaign and its major weaknesses. A CBS News review of the book highlighted this
weakness in particular:

“Campaign manager Robby Mook put a lot of faith in the campaign’s computer algorithm, Ada, which was supposed to give them a leg up in turning out likely voters. But the Clinton campaign’s use of the highly complex algorithm focused on ensuring voter turnout, rather than attracting voters from across party lines.

“According to the book, Mook was insistent that the software would be revered as the campaign’s secret weapon once Clinton won the White House. With his commitment to Ada and the provided data analytics, Mook often butted heads with Democratic Party officials, who were concerned about the lack of attention in persuading undecided voters in Clinton’s favor.  Those Democratic officials, as it turned out, had a point.”


Of course, this had become part of the conventional wisdom since the day after the election. For example, on November 9, 2016, the Washington Post had a story “Clinton’s data-driven campaign relied heavily on an algorithm named Ada. What didn’t she see?”:

“Ada is a complex computer algorithm that the campaign was prepared to publicly unveil after the election as its invisible guiding hand … the algorithm was said to play a role in virtually every strategic decision Clinton aides made, including where and when to deploy the candidate and her battalion of surrogates and where to air television ads … The campaign’s deployment of other resources — including county-level campaign offices and the staging of high-profile concerts with stars like Jay Z and Beyoncé — was largely dependent on Ada’s work, as well.”

But the story had another point about Ada:

“Like the candidate herself, she had a penchant for secrecy and a private server … the particulars of Ada’s work were kept under tight wraps, according to aides. The algorithm operated on a separate computer server than the rest of the Clinton operation as a security precaution, and only a few senior aides were able to access it.”

While the algorithm clearly wasn’t the only or perhaps even the most important reason for the failure of the campaign, that last piece illustrates why the Clinton use of analytics wasn’t more successful. Like many other failed analytics initiatives, it operated in an atmosphere of secrecy and arrogance – “we’re the smartest guys around here, so let us do our thing.”

The successful uses of analytics, in campaigns or elsewhere, try to use (and then test) the best insights of the people with long experience in a field. Those people can even help the analysts look at the right questions – in the case of the Clinton campaign, converting undecided voters.

The best analytics efforts are a two-way conversation that helps the “experts” to understand better which of their beliefs are still correct and helps the analytics staff to understand where they should be looking for predictive factors.

Again, analytics wasn’t the only factor that led to President Obama’s winning elections in 2008 and 2012, but the Obama campaign’s use of analytics felt different from Clinton’s. One article, “Inside the Obama Campaign’s Big Data Analytics Culture”, described “an archetypical story of an analytics-driven organization that aligned people, business processes and technologies around a clear mission” instead of focusing on the secret sauce and a top-down, often strife-filled, environment.


InfoWorld’s story about the 2012 campaign described a widely dispersed use of analytics –

“Of the 100 analytics staffers, 50 worked in a dedicated analytics department, 20 analysts were spread throughout the campaign’s various headquarters, and another 30 were in the field interpreting the data.” So, there was plenty of opportunity for analytics staffers to learn from others in the campaign.

And the organizational culture was molded to make this successful as well –

“barriers between disparate data sets – as well as between analysts – were lowered, so everyone could work together effectively. In a nutshell, the campaign sought a friction-free analytic environment.”

Obama’s successful use of analytics was a wake-up call to many politicians, Hillary Clinton included. But did they learn all the lessons of his success? Apparently not.

Coming back to the 2016 election, there is then the Trump campaign. Despite the candidate’s statements, his campaign also used analytics, employing Cambridge Analytica, the British firm that helped the Brexit forces to win in the UK. Thus, 2016 wasn’t as much of a test of analytics vs. no analytics as has sometimes been reported.


But if an article published two weeks ago in the British newspaper the Guardian, “The great British Brexit robbery: how our democracy was hijacked”, is even close to the mark, there is a different question about the good and bad uses of analytics in both the Trump and Brexit campaigns. Scary in parts and perhaps too jaundiced in others, this story raises questions for the future – as analytic tools get better, will the people using those tools realize that they face more than just technical challenges?

The good and bad use of analytics will not just be a question of whether the results are executed well or poorly – whether the necessary changes and learning among all members of an organization take place. It will also be a question of whether analytics tools are being used in ways that are good or bad in an ethical sense.

© 2017 Norman Jacknis, All Rights Reserved. @NormanJacknis

Analytics And Leading Change

Next week, I’m teaching the summer semester version of my Columbia University course called Analytics and Leading Change for the Master’s Degree program in Applied Analytics. While there are elective courses on change management in business and public administration schools, this combination of analytics and change is unusual. The course is also a requirement. Naturally, I’ve been asked: why?

The general answer is that analytics and change are intertwined.

Successfully introducing analytics into an organization shares all the difficulties of introducing any new technology, but more so. The impact of analytics – if successful – requires change, often deep change that can challenge the way that executives have long thought about the effect of what they were doing.

As a result, often the reaction to new analytics insights can be a kneejerk rejection, as one Forbes columnist asked last year in an article titled “Why Do We Frequently Question Data But Not Assumptions?”.

A good, but early, example of the impact of what we now call “big data” goes back twenty-five years, to the days before downloaded music.

Back then, the top 40 selections of music on the “air” were based on what radio DJs (or program directors) chose and, beyond that, the best information about market trends came from surveys of record store clerks’ ad hoc observations. Those sources, too, emphasized new mainstream rock and pop music.

In 1991, in one of the earliest big data efforts in retail, a new company, SoundScan, came along and collected data from automated sales registers in music stores. What they found went against the view of the world that was then widely accepted: old music, like Frank Sinatra, and genres other than rock were very popular.

Music industry executives then had to change the way they thought about the market and many of them didn’t. This would happen again when streaming music came along. (For more on this bit of big data history, see https://en.wikipedia.org/wiki/Nielsen_SoundScan and http://articles.latimes.com/1991-12-08/entertainment/ca-85_1_sales-figures .)

A somewhat more recent example is the way that insights from analytics have challenged some of the traditional assumptions about motivation that are held by many executives and many staff in corporate human resource departments. Tom Davenport’s Harvard Business Review article in 2010 on “Competing on Talent Analytics” provides a good review of what can be learned, if executives are willing to learn from analytics.

The first, larger lesson is: If the leaders of analytics initiatives don’t understand the nature of the changes they are asking of their colleagues, then those efforts will end up being nice research reports and the wonderful insights generated by the analysts will disappear without impact or benefit to their organizations.

The other side of the coin and the second reason that analytics and change leadership are intertwined is a more positive one. Analytics leaders have a potential advantage over other “change agents” in understanding how to change an organization. They can use analytics tools to understand what they’re dealing with and thus increase the likelihood that the change will stick.

For instance, with the rise of social networks on the internet, network analytics methods have developed to understand how the individuals in a large group of people influence each other. Isn’t that also an issue in understanding the informal, perhaps the real, structure of an organization which the traditional organization charts don’t illuminate?
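As a minimal, purely illustrative sketch (the names and interaction records here are invented), even a few lines of standard-library Python can rank people by how many distinct colleagues they interact with – a crude stand-in for the richer centrality measures that network analytics offers:

```python
from collections import defaultdict

# Hypothetical communication records (sender, receiver), e.g. drawn
# from anonymized email or calendar metadata.
interactions = [
    ("ana", "ben"), ("ana", "carla"), ("ben", "carla"),
    ("carla", "dev"), ("dev", "elsa"), ("carla", "elsa"),
]

# Build an undirected contact graph: who talks to whom.
contacts = defaultdict(set)
for a, b in interactions:
    contacts[a].add(b)
    contacts[b].add(a)

# Degree centrality (the share of colleagues each person touches) is
# the simplest proxy for informal influence; richer measures such as
# betweenness would need a graph library like networkx.
n = len(contacts)
centrality = {p: len(c) / (n - 1) for p, c in contacts.items()}
ranked = sorted(centrality, key=centrality.get, reverse=True)
print(ranked[0])  # carla, who bridges the two clusters
```

In a real organization the interesting output is exactly the person an org chart would miss – the “carla” who connects otherwise separate groups.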

In another, if imperfect example, the Netherlands office of Deloitte created a Change Adoption Profiler to help leaders figure out the different reactions of people to proposed changes.

Unfortunately, leaders of analytics in many organizations too infrequently use their own tools to learn what they need to do and how well they are doing it. Pick your motto about this – “eat your own lunch (or dogfood)” or “doctor heal thyself” or whatever – but you get the point.

© 2017 Norman Jacknis, All Rights Reserved. @NormanJacknis

Augmented Reality Rising

Last week, I gave a presentation at the Premier CIO Summit in Connecticut on the Future of User Interaction With Technology, especially the combined effects of developments in communicating without a keyboard, augmented reality (AR) and machine learning.  I’ve been interested in this for some time and have written about AR as part of the Wearables movement and what I call EyeTech.

First, it would help to distinguish these digital realities. In virtual reality, a person is placed in a completely virtual world, eyes fully covered by a VR headset – it’s 100% digital immersion. It is ideal for games, space exploration, and movies, among other uses yet to be created.

With augmented reality, there is a digital layer that is added onto the real physical world. People look through a device – a smartphone, special glasses and the like – that still lets them see the real things in front of them.

Some experts make a further distinction by talking about mixed reality in which that digital layer enables people to control things in the physical environment. But again, people can still see and navigate through that physical environment.

When augmented reality first became possible, especially with smartphones, there were a variety of interesting but not widespread uses. A good example is the way that some locations could show the history of what happened in a building long ago – so-called “thick mapping”.

There were business cards that could pop up an introduction and a variety of ancillary information that couldn’t fit on a card, as in this video.

There were online catalogs that enabled consumers to see how a product would fit in their homes. These videos from Augment and Ikea are good examples of what’s been done in AR.

Now, a few years later, this audience was very interested in learning about and seeing what’s going on with augmented reality. And why not? After a long time under the radar, or in the shadow of virtual reality hype, there is an acceleration of interest in augmented (and mixed) reality.

Although it was easy to satirize the players in last year’s Pokémon Go craze, that phenomenon brought renewed attention to augmented reality via smart phones.

Just in the last couple of weeks, Mark Zuckerberg at the annual Facebook developers conference stated that he thinks augmented reality is going to have tremendous impact and he wants to build the ecosystem for it. See https://www.nytimes.com/2017/04/18/technology/mark-zuckerberg-sees-augmented-reality-ecosystem-in-facebook.html

As the beginning of the article puts it:

“Facebook’s chief executive, Mark Zuckerberg, has long rued the day that Apple and Google beat him to building smartphones, which now underpin many people’s digital lives. Ever since, he has searched for the next frontier of modern computing and how to be a part of it from the start.

“Now, Mr. Zuckerberg is betting he has found it: the real world. On Tuesday, Mr. Zuckerberg introduced what he positioned as the first mainstream augmented reality platform, a way for people to view and digitally manipulate the physical world around them through the lens of their smartphone cameras.”

And shortly before that, an industry group – UI LABS and The Augmented Reality for Enterprise Alliance (AREA) – united to plot the direction and standards for augmented reality, especially now that the applications are taking off inside factories, warehouses and offices, as much as in the consumer market. See http://www.uilabs.org/press/manufacturers-unite-to-shape-the-future-of-augmented-reality/

Of course, HoloLens from Microsoft continues to provide all kinds of fascinating uses of augmented reality as these examples from a medical school or field service show.

Looking a bit further down the road, the trend that will make this all the more impactful for CIOs and other IT leaders is how advances in artificial intelligence (even affective computing), the Internet of Things and analytics will provide a much deeper digital layer that will truly augment reality. This then becomes part of a whole new way of interacting with and benefiting from technology.

© 2017 Norman Jacknis, All Rights Reserved. @NormanJacknis

Interactivity For An Urban Digital Experience

This is the third and last of a series of posts about a new urban digital experience in the streets of Yonkers, New York. [To read the previous posts, click on part1 and part2.]

As a reminder, the two main goals of this project are:

  • To enhance the street life of the city by offering delightful destinations and interesting experiences – a new kind of urban design
  • To engage, entertain, educate and reinforce the image of Yonkers as an historic center of innovation and to inspire the creativity of its current residents

We started out with a wide variety of content that entertains, educates and reinforces the residents’ understanding of their city. As the City government takes over full control of this, the next phase will be about deepening the engagement and interactivity with pedestrians – what will really make this a new tool of urban design.

This post is devoted to just a few of the possible ways that a digital experience on the streets can become more interactive.

First, a note about equipment and software. I’ve mentioned the high-quality HD projectors and outdoor speakers. I haven’t mentioned the cameras that are also installed. So far, those cameras have been used to make sure that the system is operating properly. But the best use of cameras is seeing – and, with the proper software, analyzing – what people are doing when they see the projections or hear something.

The smartphones that people carry as they pass by also allow them to communicate via websites, social media or even their movement.

With all this in place, it helps to think of what can happen in these four categories:

  1. Contests
  2. Control of Text
  3. Physical Interaction
  4. Teleportation

Contests

What’s your favorite part of the city? Show a dozen or so pictures and let people vote on them – and show real time results. It’s not a deeply significant engagement, but it will bring people out to show support for their area or destination.

Or people can be asked: what are your top choices in an amateur poetry contest (which only requires audio) or the best photography of the waterfront or a beautiful park or the favorite item that has been 3D printed inside the library’s makerspace? Or???

Even the content itself can be assessed in this way. We can ask passersby to give a thumbs up or down for what they are seeing at that moment. (Since the schedule of content is known precisely, we would also know what each person was referring to.)
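Matching a thumbs up or down to what was on screen is just a timestamp lookup against the known playout schedule. A minimal sketch, with invented schedule entries, of how that lookup could work:

```python
import bisect
from datetime import datetime

# Hypothetical playout schedule: (start time, content item), kept sorted.
schedule = [
    (datetime(2017, 6, 1, 18, 0), "Yonkers history reel"),
    (datetime(2017, 6, 1, 18, 10), "Waterfront photo montage"),
    (datetime(2017, 6, 1, 18, 25), "Local poetry audio"),
]
starts = [start for start, _ in schedule]

def content_at(moment):
    """Return whichever item was playing when a reaction arrived."""
    i = bisect.bisect_right(starts, moment) - 1
    return schedule[i][1] if i >= 0 else None

vote_time = datetime(2017, 6, 1, 18, 12)
print(content_at(vote_time))  # Waterfront photo montage
```

Every thumbs up or down then lands against a specific piece of content, so the ratings can be aggregated per item over time.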

People could vote on what kind of music they would want to hear at the moment, like an outdoor jukebox, or on what videos they might want to see at the moment.

Contests of this kind are a pretty straightforward use of either smartphones or physical gestures. Cameras can detect when people point to something to make a choice. It is also possible to use phone SMS texting to register votes, and the nice thing about this use of SMS is that no one has to edit and censor what people write, since they can only select among the (usually numerical) choices they’re given. SMS voting can be supplemented with voting on a website.
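Because SMS voters can only send one of the numbered choices, tallying requires no human moderation at all. A minimal sketch (the choice labels and message format are invented for illustration):

```python
from collections import Counter

# Hypothetical contest: numbered choices mapped to their labels.
CHOICES = {"1": "Getty Square", "2": "The waterfront", "3": "Untermyer Park"}

def tally(messages):
    """Count incoming SMS bodies, silently dropping anything that is
    not one of the numbered choices (so nothing needs censoring)."""
    votes = Counter()
    for body in messages:
        key = body.strip()
        if key in CHOICES:
            votes[CHOICES[key]] += 1
    return votes

incoming = ["2", "1", "2 ", "hello!", "3", "2"]
results = tally(incoming)
print(results.most_common(1))  # [('The waterfront', 3)]
```

Free-form messages like “hello!” simply fall out of the count, which is exactly why this style of voting needs no editor.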

Control Of Text

Control implies that the person in front of a site can control what’s there merely by typing some text on a smartphone – or, eventually, by speaking into a microphone backed by speech recognition software.

People can ask about the history of people who have moved to Yonkers by typing in a family name, which then triggers an app that searches the local family database.

This kind of interaction requires that someone or a service provides basic editing of the text provided by people (i.e., censorship of words and ideas not appropriate for a site frequented by the general public).
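That pre-screening can start as simply as a blocklist check before anything reaches the projector, with everything else passed through (or queued for a human editor). A minimal sketch – the word list here is only a placeholder, and a real deployment would use a maintained moderation list or service:

```python
import re

# Placeholder blocklist standing in for a real moderation list.
BLOCKED = {"badword", "worseword"}

def screen(text):
    """Allow a submission only if none of its words is blocked."""
    words = set(re.findall(r"[a-z']+", text.lower()))
    return words.isdisjoint(BLOCKED)

print(screen("Looking for the Jacknis family"))  # True
print(screen("this badword slipped in"))         # False
```

A simple word filter is obviously imperfect – context matters – which is why a human review queue remains part of the design.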

Physical Interaction

With software that can understand or at least react to the movement of human hands, feet and bodies, there are all kinds of possible ways that people can interact with a blended physical/digital environment.

In a place like Getty Square where the projectors point down to the ground, it’s possible to show dance steps. Or people can modify an animation or visual on a wall by waving their arms in a particular way.

Originally in Australia, but now elsewhere, stairs have been digitized so that they play musical notes when people walk on them. These “piano stairs” are relatively easy to create and actually don’t really need to be stairs at all – the same effect can be created on a flat surface and it doesn’t have to generate piano sounds only.
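The underlying logic of a “piano stairs” installation is just a mapping from detected positions to sounds. A sketch of that mapping, with invented zone boundaries and note names, leaving the camera detection and audio playback to real hardware:

```python
# Each floor zone spans a range of x positions (in centimeters) and
# triggers one note; nothing here requires actual stairs.
ZONES = [
    ((0, 100), "C4"),
    ((100, 200), "D4"),
    ((200, 300), "E4"),
    ((300, 400), "F4"),
]

def note_for(x):
    """Return the note for a detected foot position, or None if the
    person is outside the instrumented area."""
    for (lo, hi), note in ZONES:
        if lo <= x < hi:
            return note
    return None

# A walker's detected positions become a little melody.
steps = [50, 150, 250, 350]
print([note_for(x) for x in steps])  # ['C4', 'D4', 'E4', 'F4']
```

Swapping the note names for any other sound files is what makes the same flat-surface trick work with sounds other than a piano’s.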

In Eindhoven, the Netherlands, there is an installation called Lightfall, where a person’s movements control the lighting. See https://vimeo.com/192203302

Pedestrians could even become part of the visual on a wall and, using augmented reality, even be transformed – say, into the founder of the city, wearing appropriately old clothes. Again, the only limit is the creativity of those involved in designing these opportunities.

Teleportation

The last category I’m calling teleportation, although it’s not really what we’ve seen in Star Trek. Instead, with cameras, microphones, speakers and screens in one city and a companion setup in another, it would be possible for people in both places to casually chat as if they were on neighboring benches in the same park.

In this way, the blending of the physical and digital provides the residents with a “window” to another city.

I hope this three-part series has given city leaders and others who care about the urban environment a good sense of how to create 21st century blended environments – how they might start with available content and then go beyond that to interaction with people walking by.

Of course, even three blog posts are limited, so feel free to contact me @NormanJacknis for more information and questions.

© 2017 Norman Jacknis, All Rights Reserved