Like many other people who have been watching the COVID-19 press conferences held by Trump and Cuomo, I came away with a very different feeling from each. Beyond the obvious policy and partisan differences, I felt there was something more going on.
Coincidentally, I’ve been doing some research on text analytics/natural language processing on a different topic. So, I decided to use these same research tools on the transcripts of their press conferences from April 9 through April 16, 2020. (Thank you to the folks at Rev.com for making available these transcripts.)
One of the best approaches is known by its initials, LIWC (Linguistic Inquiry and Word Count), and was created some time ago by Pennebaker and colleagues to assess the psycho-social dimensions of texts. It’s worth noting that this assessment is based purely on the text – their words – and doesn’t include non-verbal communications, like body language.
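For readers curious about the mechanics, here is a minimal sketch of the dictionary-based counting that LIWC-style tools perform. The toy lexicon below is my own invention, since the real LIWC dictionary is proprietary and far larger:

    from collections import Counter
    import re

    # Toy category lexicon: an invented stand-in for the proprietary LIWC dictionary.
    LEXICON = {
        "positive_emotion": {"nice", "great", "good", "love"},
        "anxiety": {"worried", "afraid", "fear", "risk"},
        "family": {"family", "home", "mother", "son"},
    }

    def category_rates(text):
        """Return each category's share of total words, as a percentage."""
        words = re.findall(r"[a-z']+", text.lower())
        counts = Counter()
        for word in words:
            for category, vocabulary in LEXICON.items():
                if word in vocabulary:
                    counts[category] += 1
        total = len(words) or 1
        return {category: 100.0 * counts[category] / total for category in LEXICON}

    print(category_rates("We love our family and our great home, but we are worried."))

Real tools refine this with stemming, wildcards and much bigger dictionaries, but the basic idea – word counts per psychological category, normalized by text length – is the same.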
While some of the results will be unsurprising to people familiar with both Trump and Cuomo, there are also some interesting nuances in the words they used.
Here are the most significant contrasts:
The most dramatic distinction between the two had to do with emotional tone. Trump’s words had almost twice the emotional content of Cuomo’s, including words like “nice”, although maybe the use of that word should not be taken at face value.
Trump also spoke of rewards/benefits and money about 50% more often than Cuomo.
Trump emphasized allies and friends about twenty percent more often than Cuomo.
Cuomo used words that evoked health, anxiety/pain, home and family two to three times more often than Trump.
Cuomo asked more than twice as many questions, although some of these could be sort of rhetorical – like “what do you think?”
However, Trump was 50% more tentative in his declarations than Cuomo, whereas Cuomo expressed greater certainty.
While both men used the present tense much more than the future, Cuomo’s use of the present was greater than Trump’s. On the other hand, Trump’s use of the future and past tenses was greater than Cuomo’s.
Trump used “we” a little more often than Cuomo and much more than he used “you”. Cuomo used “you” between two and three times more often than Trump. Trump’s use of “they” even surpassed his use of “you”.
Distinctions of this kind are never crystal clear, even with sophisticated text analytics and machine learning algorithms. The ambiguity of human speech is not just a problem for machines, but also for people communicating with each other.
But these comparisons from text analytics do provide some semantic evidence for the comments by non-partisan observers that Cuomo seems more in command. This may be because the features of his talks would seem to better fit the movie portrayal and the average American’s idea of leadership in a crisis – calm, compassionate, focused on the task at hand.
I noticed that the White House unveiled today its proposal for many changes in US taxes. I don’t normally comment on current political controversies and am not going to do so now, whatever my private views are on the policies.
But, as someone with an interest in 21st century technology, I did take notice of one thing about the proposal that I’ll comment on – admittedly something not as important as other aspects of the plan, but something that seems so outdated.
It is another example of how, for all the talk about technology and change, far too many people – especially public officials – are still subject to what the media expert Marshall McLuhan called the “horseless carriage syndrome”. When automobiles were first getting popular a hundred years ago, they were seen as carriages with a motor instead of a horse.
Only much later did everyone realize that the automobile made possible a different world, including massive suburbanization, increased mobility for all generations, McDonald’s and drive-ins (for a couple of decades anyway), etc. Cars were really more than motorized, instead of horse-driven, carriages.
Similarly, tech is more than the automation of traditional ways of doing things. Which brings me back to taxes.
In pursuit of a goal of simplifying the tax system, the White House proposed today to reduce the number of tax brackets from seven to three.
And that brings me to a question I have previously asked: why do we still have these tables of brackets that determine how much income tax we’re supposed to pay?
The continued use of tax brackets is just another example of horseless carriage thinking by public officials because it perpetuates an outmoded and unnecessary way of doing things.
In addition to being backward, brackets cause distortions in the way people make economic decisions as they try to avoid getting kicked into a higher tax bracket.
But we no longer have to live in a world limited to paper-based tables. Assuming that we don’t go to a completely flat single percentage tax – and even the White House today doesn’t propose that – there is nothing in a progressive tax that should require the use of brackets. Instead, a simple system could be based on a formula which would eliminate the negative impacts of bracket-avoiding behavior that critics of progressive taxation point to.
And all it would take to implement this is an app on our phones or the web. An app could compute the most basic flat tax formula, like “TaxOwed = m * TaxableIncome”, where m is some percentage. It could also obviously handle more complicated versions for progressive taxes, like logarithmic or exponential formulas.
No matter the formula, we’re not talking about much computing power nor a very complicated app to build. There are tens of thousands of coders who could finish this app in an afternoon.
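To make this concrete, here is a minimal sketch of what such an app would compute. The rates and the shape of the smooth progressive curve are illustrative assumptions on my part, not policy proposals:

    import math

    def flat_tax(income, m=0.20):
        """The most basic formula: one rate applied to all taxable income."""
        return m * income

    def smooth_progressive_tax(income, top_rate=0.35, scale=400_000):
        """A bracket-free progressive formula: the effective rate climbs
        continuously toward top_rate as income grows, so there is never a
        bracket 'cliff' to jump over. All constants here are illustrative."""
        effective_rate = top_rate * (1 - math.exp(-income / scale))
        return effective_rate * income

    for income in (30_000, 75_000, 250_000, 1_000_000):
        print(income, round(flat_tax(income)), round(smooth_progressive_tax(income)))

Because the effective rate is a smooth function of income, earning one more dollar never triggers a sudden jump in the rate applied to everything else.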
Again, the reduction of tax brackets from 7 to 3 is not among the big issues of the proposed tax changes. But maybe we’d also get better tax policies on the big issues from both parties if public officials could also reform and modernize their thinking – and realize we’re all in the digital age now.
Next week, I’m teaching the summer semester version of my Columbia University course called Analytics and Leading Change for the Master’s Degree program in Applied Analytics. While there are elective courses on change management in business and public administration schools, this combination of analytics and change is unusual. The course is also a requirement. Naturally, I’ve been asked: why?
The general answer is that analytics and change are intertwined.
Successfully introducing analytics into an organization shares all the difficulties of introducing any new technology, but more so. The impact of analytics – if successful – requires change, often deep change that can challenge the way that executives have long thought about the effect of what they were doing.
A good, if early, example of the impact of what we now call “big data” goes back twenty-five years, to the days before downloaded music.
Back then, the top 40 selections of music on the “air” were based on what radio DJs (or program directors) chose and, beyond that, the best information about market trends came from ad hoc surveys of record store clerks. Those choices too emphasized new mainstream rock and pop music.
In 1991, in one of the earliest big data efforts in retail, a new company, SoundScan, came along and collected data from automated sales registers in music stores. What they found went against the view of the world that was then widely accepted: old music, like Frank Sinatra, and genres other than rock were very popular.
A somewhat more recent example is the way that insights from analytics have challenged some of the traditional assumptions about motivation that are held by many executives and many staff in corporate human resource departments. Tom Davenport’s Harvard Business Review article in 2010 on “Competing on Talent Analytics” provides a good review of what can be learned, if executives are willing to learn from analytics.
The first, larger lesson is: If the leaders of analytics initiatives don’t understand the nature of the changes they are asking of their colleagues, then those efforts will end up being nice research reports and the wonderful insights generated by the analysts will disappear without impact or benefit to their organizations.
The other side of the coin and the second reason that analytics and change leadership are intertwined is a more positive one. Analytics leaders have a potential advantage over other “change agents” in understanding how to change an organization. They can use analytics tools to understand what they’re dealing with and thus increase the likelihood that the change will stick.
For instance, with the rise of social networks on the internet, network analytics methods have developed to understand how the individuals in a large group of people influence each other. Isn’t that also an issue in understanding the informal, perhaps the real, structure of an organization which the traditional organization charts don’t illuminate?
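As a simple illustration (with invented names and relationships), the open source networkx library can score who the brokers are in an organization’s informal network, the people a change effort most needs on its side:

    import networkx as nx

    # A hypothetical informal network: edges record who actually consults whom,
    # not who reports to whom on the official org chart.
    G = nx.Graph()
    G.add_edges_from([
        ("Ana", "Raj"), ("Ana", "Mei"), ("Raj", "Mei"),
        ("Mei", "Tom"), ("Tom", "Sue"), ("Sue", "Raj"),
        ("Tom", "Liz"),
    ])

    # Betweenness centrality flags the brokers that information (and resistance
    # to change) must pass through.
    for person, score in sorted(nx.betweenness_centrality(G).items(),
                                key=lambda item: -item[1]):
        print(f"{person}: {score:.2f}")

In a real organization you would build the edge list from surveys, email metadata or meeting data rather than guesswork, but the analysis itself is this simple.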
In another, if imperfect example, the Netherlands office of Deloitte created a Change Adoption Profiler to help leaders figure out the different reactions of people to proposed changes.
Unfortunately, leaders of analytics in many organizations too infrequently use their own tools to learn what they need to do and how well they are doing it. Pick your motto about this – “eat your own dog food” or “physician, heal thyself” or whatever – but you get the point.
As internet bandwidth gets better, we’re seeing more people communicate by videoconference. FaceTime and Skype are now quite common ways for family members and friends to see each other when they are physically separated. Video conferencing has been around for a while in big multi-national corporations.
More slowly, we’re also seeing the adoption of videoconferencing for meetings of public bodies. There are various reasons for the slow adoption.
Public agencies are unfortunately often populated by people from the less technically savvy part of the overall population, or their members are just more comfortable with physical rather than virtual face-to-face communication.
Or their members may have had a bad experience with video conferencing a few years ago, when neither the software nor the bandwidth was sufficient to become invisible and not interfere with free-flowing conversation. (Of course, there are still examples of bad videoconferences even now. I mention video products I’ve used successfully below.)
Public bodies are also subject to various open meeting laws and rules, which haven’t always caught up with changes in technology.
But things are changing, so I went on a search of the video conferencing practices of government bodies. Here’s some of what I found.
A bigger step is virtual attendance at meetings of the members themselves.
This is especially important when distances are large or a quorum is hard to achieve. It may be unusual for a city council or a state legislature to fail to have a quorum, but there are tens of thousands of other public bodies that can’t get their business done because not enough members can show up. This affects school boards and libraries and water districts and state advisory boards, etc. (By the way, the problem isn’t new – the first session of the US Congress was delayed for some time while members arrived very slowly.)
Video conferencing that would enable members to participate remotely would seem to be a natural solution. But as in all other aspects of the public sector, things aren’t so simple and policies seem to be the first obstacle to overcome.
Over the last few years, the Wyoming State Legislature has developed its video policy, which still seems to sit somewhere between those without full confidence in the technology and those who want to use it. Approvals are needed for committee use of video, as the policy states: “With prior consent of the committee chair, a video conference may be held for legislators unable to attend a meeting at the official meeting location.” More generally, “An entire committee can meet via video conference at the direction of the chairman.”
Since 2013, the State of Missouri has allowed those elected to public bodies (mostly local) to vote and participate by video. But the Missouri Municipal League felt it necessary to issue a model policy for videoconferencing. It particularly emphasizes such guidance as: “a member’s use of video conference attendance should occur only sparingly.”
Following a change in the Illinois Open Meetings laws, the Schaumburg Library in 2009 adopted an official policy on this subject. They require a quorum of members at the physical meeting, not counting members participating via electronic means. Once that quorum is established, the remote participants have full rights although their votes are recorded as being remote. The policy also lists the acceptable reasons for wanting to participate remotely – employment, board business, illness, or family emergency.
In Texas, school boards can also use videoconferencing, with somewhat similar requirements for a quorum. Public bodies in Pennsylvania, by contrast, can count remote participants as part of the quorum.
The State of Florida has empowered condo boards, which are a major form of local governance there, to use video. The State allows board members to be counted as present and vote remotely via video conferencing.
In New York State, which has some of the strictest open meeting laws, the State has allowed members to participate in meetings by video, but not phone conference calls. The idea is that, as in a traditional physical meeting, everyone has to be able to see all members’ reactions at all times.
In addition, New York State looks on video participation as a remote extension of the physical meeting, so public bodies using video must list all locations in their public notices – both the main physical meeting location as well as any location where a member is using video. Presumably, someone in a hotel in, say, Florida or France would have to allow any interested citizens to come into their room and also see what’s going on.
I’m on a number of public boards and they have different policies. Some boards are reluctant to use video at all. Another board has just had a completely virtual meeting that worked very well using Fuze and will be repeating this at least twice a year. I’ve also used Zoom successfully for meetings with large numbers of people.
As with most adoption of technology, the transition is not smooth and the old and the new exist together. In the streets of cities a hundred years ago, there were accidents between automobiles (then still relatively new) and horses pulling carriages.
Why should we expect video conferencing to be different?
The book, “Team of Teams: New Rules of Engagement for a Complex World”, by retired General Stanley McChrystal and his associates Tantum Collins, David Silverman, and Chris Fussell has been out for more than a year. I hadn’t gotten around to reading it partly because I wasn’t sure I wanted to read what I thought would be yet another general’s exercise in self-promotion. I’ve also been through too many conferences filled with speeches from high-ranking executives that are essentially war stories in which they are the heroes of the story.
So, when I finally had the time to read it at the end of last year, I was surprised to find that this book is one of the best recent books on management. It has been criticized by some as not really having anything new in it and merely reflecting the undue length of time it has taken a general to figure out these things.
While there is some truth to that, the fact remains that most large corporate and public sector organizations operate in the old style that McChrystal finds inadequate for a new era of change, complexity, and creativity. This includes even highly touted tech companies that reach a certain size and stage of maturity, even while they profess to be using agile approaches.
For General McChrystal, it’s a question of what the organization is designed to achieve. Traditional “Taylorism”, which has been the model of most large organizations, aims to maximize efficiency. As part of that goal, he writes “organizations have implemented as much control over subordinates as technology physically allowed.” That certainly sounds like the traditional image of the Army and many large corporations.
Instead, he argues that in today’s world, adaptability is much more important. This is a necessary response to deep and widespread technological changes. He also notes that those same technologies make possible a more modern, more adaptable organization.
Although much of what’s in the book isn’t exactly new, the authors synthesize the material and lay it out to build a story that should be compelling to any senior executive.
The value of teams and the use of the intelligence of team members, rather than considering them cogs in a large machine, is explained well. But the real challenge in leading large organizations is how to scale those benefits.
That’s where McChrystal and co-authors make a real contribution.
Here are some of the key takeaways:
A systems approach and a more organic, rather than mechanistic, view are needed by leaders when looking at large organizations whose units must work together. Each person in the organization needs to maintain a systemic perspective too.
Frequent inter-team communication – “shared awareness” of the environment that develops into “shared consciousness” – is necessary to prevent teams from doing things that run counter to the needs of the overall organization.
On the latter point, perhaps communication is too weak a word because it implies that each side decides when and what to say. The General found instead that absolute transparency between units (and teams) was necessary. And, as he noted, “In traditional organizations, this constitutes culture change that does not come easily.”
Although this has been well known to organizational researchers for some time, the practice of using physical space to encourage this kind of approach is not widespread. General McChrystal relates his own and other organizations’ use of common spaces. Of course, in a world of increasingly virtual organizations it is especially important to create continuously operating virtual spaces, with full video, to achieve the same effect.
Where people from different teams couldn’t be physically next to each other, he set up “embedding and liaison programs to create strong lateral ties between our units, and with our partner organizations. Where systemic understanding mirrors the sense of ‘purpose’ that bonds small teams, this mirrored the second ingredient to team formation: ‘trust.’”
The leader as mastermind or chess master is yet another old concept to be thrown away and replaced by the model of a gardener who enables the ecosystem rather than directing it. We should not “demand unrealistic levels of knowledge in leaders and force them into ineffective attempts to micromanage.”
In order to be able to react with necessary speed to ever changing situations, organizational leaders need to abandon traditional control because “Individuals and teams closest to the problem, armed with unprecedented levels of insights from across the network, offer the best ability to decide and act decisively.”
This book is an excellent guide to effectively managing large-scale operations to implement a strategy. But, much like the wars that General McChrystal was part of, it doesn’t focus on whether the larger strategy makes sense. That’s not a criticism of the book, just a realization that there are important considerations beyond its scope.
In previous elections, prediction markets were relatively accurate and were touted as competitors to public opinion polling. So how did they do this time?
The Iowa Electronic Markets had two prediction markets concerning the Presidential election. One was for the percentage of the popular two-party vote, which over the course of betting predicted Clinton 50% and Trump 48%. [These were individual contracts, which may be why the numbers don’t add up to exactly 100.] According to the most recent actual vote count, the result of the two-party split was Clinton 51% and Trump 49%.
The other was for the winner of the popular vote, which over the course of betting was 97% for Clinton and 1% for Trump. This was correct as current estimates show her getting over two million more votes than him.
Alas, winning the popular vote wasn’t enough this time and this was where the prediction markets seem to have run into a problem.
In one of the few markets that focused on electoral votes, a German betting market ended up predicting Clinton 300, Trump 237. (The real result was almost the reverse.)
PredictWise’s betting market had Clinton “winning” with an 86% probability. (In their defense, of course, that also means a 14% chance for Trump, which has to happen some of the time if we’re talking probability, not certainty, after all.)
“Polls aren’t perfect, but neither are political betting markets. Since these markets have gained credibility in predicting elections, they have started taking changes in public opinion polls less seriously. Overconfidence in betting markets makes the markets look misleadingly stable, and that false sense of stability makes it harder for them to predict events that shake up the status quo — such as the outcome of the Brexit referendum, or Trump’s success in the Republican presidential nomination process. As Rothschild himself has pointed out, ‘prediction markets have arrived at a paradoxical place: Their reliability, the very source of their prestige, is causing them to fail.’ ”
In looking at these markets and, more generally, crowd predictions of events, it’s worth going back to James Surowiecki’s book, “The Wisdom of Crowds”. He described both the rationale for prediction markets — which have been well publicized — and the characteristics of accurate prediction markets — which have received less emphasis.
“The premise is that under the right circumstances, the collective judgment of a large group of people will generally provide a better picture of what the future might look like than anything one expert or even a small group of experts will come up with. … [Prediction markets] work much like a futures market, in which the price of a contract reflects the collective day-to-day judgment either on a straight number—for instance, what level sales will reach over a certain period—or a probability—for example, the likelihood, measured as a percentage, that a product will hit a certain milestone by a certain date.”
“[F]or a crowd to be smart, it needs to satisfy certain criteria. It needs to be diverse, so that people are bringing different pieces of information to the table. It needs to be decentralized, so that no one at the top is dictating the crowd’s answer. It needs to summarize people’s opinions into one collective verdict. And the people in the crowd need to be independent, so that they pay attention mostly to their own information and don’t worry about what everyone around them thinks.”
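A toy simulation makes the independence criterion vivid. In the sketch below (all numbers invented), a crowd of independent guessers averages out close to the true value, while a crowd anchored on one shared, biased signal does not:

    import random

    random.seed(1)
    TRUE_VALUE = 100.0

    # Independent crowd: each person sees their own noisy private signal.
    independent = [TRUE_VALUE + random.gauss(0, 20) for _ in range(1000)]

    # Herding crowd: everyone anchors on a single shared (and biased) public signal.
    shared_signal = TRUE_VALUE + 15
    herding = [shared_signal + random.gauss(0, 2) for _ in range(1000)]

    print("independent crowd average:", round(sum(independent) / len(independent), 1))
    print("herding crowd average:", round(sum(herding) / len(herding), 1))

The herding crowd looks impressively precise – everyone agrees – but it is precisely wrong, which is the paradox the 2016 markets ran into.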
Did the prediction markets in 2016 meet Surowiecki’s criteria? Not really.
One problem with betting markets is that they are not diverse, not representative of a broad spectrum of the population. As a CNBC report noted: “Another issue that may have contributed to the miss [on Brexit and now the US election] is the relatively similar mindset among bettors generally.”
Since all bettors can see what the others are doing, it’s hard to argue that their judgments are independent. And while the decisions are, in a way, decentralized, to the extent that they mirror the current polling results and news reports from national media, there is less decentralization than it appears.
So do we just decide that the results of this year’s election call into question the value of crowd predictions? I think not.
But rather than focusing on predicting who wins the White House or the Super Bowl or the number of coins in a large bottle, there is another use of prediction markets for business and government leaders — testing the likelihood that people will respond positively to a new program or offer.
No matter how much market research (aka polling) is done, it is often difficult to assess how the public will react to a proposed program. I’m suggesting that prediction markets be used to estimate the reaction ahead of time, as long as they match Surowiecki’s criteria and don’t depend on money bets. At the very least, this would require a large and diverse set of people responding and keeping their judgments secret (until “voting” stops).
Over the last year or so, there have been several reports that rates for Affordable Care Act plans (aka Obamacare) had to be raised because fewer young, healthy people were enrolling than expected. Putting aside the merits of the policy and its goals, this is an ideal case where prediction markets could have helped assess the accuracy of an underlying assumption about the implementation of a very consequential piece of public policy.
Some experts are skeptical of prediction markets because the average person doesn’t have professional expertise. But this use of prediction markets draws on the perceptions of people about each other.
Implicit in the diversity of views that Surowiecki notes is that enough people need to care about the planned program or policy. The reason they care may be to win money, in some cases, but that’s not the only reason. They might care because the market deals with something that affects their lives.
And the nice thing about this is that if only a few people care about a planned program that also tells you something about that plan — or, at least, whether the range of outcomes might be something between a yawn and deep trouble.
It may well be that this more experimental use of prediction markets to forecast behavior will illustrate their deeper value. What do you think?
This is a follow up to last week’s post about people in positions of power whose decisions are flawed because of that powerful position.
Almost every President relishes his image as a decision maker. In the current election, there’s also much talk about temperament, with both major candidates claiming how good they are at making judgments and decisions.
But there’s little discussion about whether – out of ego, ambition, policy concerns or whatever – they end up trying to make too many decisions.
Huh? Isn’t that what the job is all about?
That’s what you would believe if you listened to the candidates and Presidents. It’s almost as if they are baseball players toting up how many hits they’ve had this season – why, I made 1,000 important decisions last year!
Many academics also focus on Presidential decision-making. Here’s a statement for students:
“Can you imagine being the president of the United States? Think about all the important decisions that must be made. A president must exercise wise decision-making skills. Decision making is simply the thought process of selecting a logical choice from the available options. For the president, the available options must seem endless!”
John Dean, famously once on the staff of President Nixon, writing just a few years ago about President Obama, stated:
“Nothing is more important in the American presidency than decision-making. It is, in fact, the very essence of the job. Presidential decisions can and do shape our history, for better or worse. Rarely, though, does the decision-making style of presidential candidates receive much attention during a campaign.”
Well, on top of the flaws in each individual decision, things only get worse when someone is making too many decisions.
When I originally wrote about this in 2011, one of the most popular articles on the New York Times website was John Tierney’s “Do You Suffer From Decision Fatigue?”. (It’s still one of the top hits when you search the subject.)
He pointed out how the quality of decisions declines as too many are made, in part because the decision makers have not conserved their willpower for the tough decisions. He cited a now frequently cited study of parole decisions:
“[A]s researchers discovered by analyzing more than 1,100 decisions over the course of a year, judges, who would hear the prisoners’ appeals and then get advice from the other members of the board, approved parole in about a third of the cases, but the probability of being paroled fluctuated wildly throughout the day. Prisoners who appeared early in the morning received parole about 70 percent of the time, while those who appeared late in the day were paroled less than 10 percent of the time.”
This pattern is a reflection of decision fatigue – the result of trying to make too many decisions. It is tied to the general limit on each person’s ability to sustain willpower (and, for that matter, rationality) over the more natural emotional instincts as the day goes on.
The American Psychological Association has a website devoted to willpower – the ability to make decisions that are based on long-term, rational goals rather than immediate gratification. While elaborating on the various ways that having stronger willpower leads to lives that are more successful, they also note the numerous studies that show it is a limited resource which can be depleted after a series of difficult decisions.
You can find all sorts of self-help articles about how to boost your willpower, including eating more to overcome low-glucose periods of the day. FastCompany magazine even credited President Obama with reducing his decision fatigue by wearing the same suit every day.
Notwithstanding the best efforts of even President Obama, the demands on public officials – Presidents, governors, mayors, even legislative bodies – to make all kinds of decisions explain a lot of the otherwise inexplicable decisions we’ve observed.
As we face another Presidential election and think about the candidates operating in the well-known bubble of the White House, I thought it worth updating and reposting a piece from four years ago, written a month before the last election.
The question I asked: Are our public leaders flawed because they were selected as public leaders?
Just a few weeks ago, an article in Fortune reminded me of this question and the phenomenon that answers it. Its author, Rita Gunther McGrath, noted that:
“In almost every disaster, you find the leaders based their decision-making on assumptions… A fundamental flaw in most governmental policy-making is that those making the deals and decisions think they are operating with facts. The reality is that they are operating instead with assumptions, many deeply held, about what causes what to happen. A policy is really a statement of assumed causality, and the law of unintended consequences is ever-present.”
The downside of a chief executive’s view of reality – i.e., assumptions – is made worse by the typical over-confidence such positions encourage.
The popular title and sub-title of the paper by Professor Kelly E. See of NYU and three other academic researchers on organizational behavior, which I originally cited, make the point: “The Decision-Making Flaw in Powerful People: Overflowing with confidence, many leaders turn away from good advice.”
Some of their key findings:
“This paper finds a link between having a sense of power and having a propensity to give short shrift to a crucial part of the decision-making process: listening to advice. Power increases confidence which can lead to an excessive belief in one’s own judgment and ultimately to flawed decisions. …
"In addition to confirming the previous
experiments’ finding that more powerful people were less likely to take
advice and were more likely to have high confidence in their answers,
this final experiment showed that high-power participants were less
accurate in their answers than low-power participants.”
A related paper by a different group of researchers, led by USC Professor Nathanael J. Fast, adds some nuance to this finding:
“Experiencing power leads to overconfident decision-making. The findings, through both mediation and moderation, also highlight the central role that the sense of power plays in producing these decision-making tendencies.
“First, sense of power, but not mood, mediated the link between power and overconfidence. Second, the link between power and overconfidence was severed when access to power was not salient to the powerful and when the powerful were made to feel personally incompetent in their domain of power.
“These findings indicate that only when objective power leads people to feel subjectively powerful does it produce overconfident decision-making.”
Unfortunately, the last finding doesn’t much change the fundamental situation for Presidents, who are extraordinarily powerful, except maybe when they deal with scientific issues that are not part of their self-image – and, even then, the position lends greater credence to their views than may be warranted.
Professor See and colleagues provided some advice about overcoming this problem:
"For
one thing, organizations could formally include advice gathering at the
earliest stages of the decision-making process, before powerful
individuals have a chance to form their own opinions. Encouraging
leaders to refrain from commenting on decisions publicly could also keep
them from feeling wedded to a particular point of view.”
Whether or not you might find this research conforms to your own experience, the last point — gathering in lots of information before public leaders decide — is a reasonable and feasible suggestion to improve decision making in many cases. Today, the Internet and the collaborative discussion tools it offers can make this happen fairly easily.
The question is whether the next President will put in place that kind of open platform for advice or wrongly trust the assumptions that she/he brought into the Oval Office.
A part of my research in graduate school included modeling a small, but influential, network of individuals – the US Supreme Court. I used the mathematical modeling tools then available. I even represented the court’s decisions in a Markov chain and computed characteristics like its eigenvalues.
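For the curious, here is a minimal sketch of that kind of computation, with an invented three-state transition matrix; the long-run behavior of the chain falls out of the eigenvector for eigenvalue 1:

    import numpy as np

    # A hypothetical transition matrix (rows sum to 1); in a toy model of
    # sequential court decisions the states might be affirm, reverse, remand.
    P = np.array([
        [0.6, 0.3, 0.1],
        [0.2, 0.7, 0.1],
        [0.3, 0.3, 0.4],
    ])

    # The eigenvector of P's transpose for eigenvalue 1 gives the stationary
    # distribution: the long-run share of time the chain spends in each state.
    eigenvalues, eigenvectors = np.linalg.eig(P.T)
    stationary = np.real(eigenvectors[:, np.argmax(np.real(eigenvalues))])
    stationary /= stationary.sum()

    print("eigenvalues:", np.round(np.real(eigenvalues), 3))
    print("stationary distribution:", np.round(stationary, 3))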
You can be excused if you’ve never heard about any of this or even about Markov chains. Nobody at the time was much interested either. But I suppose I should have stayed with it, with books now being published on the impact of the Internet and network analysis.
One of those books, Joshua Cooper Ramo’s “The Seventh Sense”, emphasizes the importance of networks and declares that there is still a wide-open gap in the tools most of us have for understanding these networks.
In an interview about the book, he set out his goal: “We live in an age where almost everything changes because of connectivity… The seventh sense is the idea that some people have an instinct for how this works that’s better, sharper than the rest of us. The book is designed to teach people how to think about connected systems so that they can have the same kind of edge. The people who see what’s coming in financial markets or in politics have that edge. It’s important that the rest of us develop it, too.”
However, the book is worth reading for what it is, not what he wants it to be. It is unusual in probing the subtleties — both positive and negative — of our network age, not the usual breathless or self-promoting material.
Most of the book describes the various ways that being connected can change the characteristics and behavior of businesses, organizations, governments – everything that we’ve inherited from the industrial era.
Much has been made in various other reviews and discussions of this book about its scary descriptions of security issues and other dangers in networks. That wasn’t news to me and shouldn’t be news to most network users who have been paying any attention.
Some people have complained that the book is so wide-ranging and repetitive it can be frustrating to read. Parts go into related territory, where he worries that it’s not just the network, but artificial intelligence, that is surpassing us in ways we don’t understand. But this isn’t a blog of literary criticism, so I’ll skip over that and go to the substance.
Considering his day job at Kissinger Associates, I thought the most interesting themes had to do with the interaction between the new global technology network and the traditional institutions of government, business and society.
Two themes, in particular, stand out:
Ramo notes that the transition from the agricultural to the industrial era was accompanied by major wars, revolutions and destruction, along with rising wealth. He asks what similar events are likely to happen in the transition to a networked age. Perhaps ISIS and this year’s disruptions in the American Presidential elections are only early warning signs of what’s to come.
He ends the book by recalling Plato on the need for wisdom in rulers, after he has presented a picture of two inadequate sets of rulers – the engineers who control the network but do not understand governance and human interactions, and the traditional government leaders who don’t understand the network.
Although we all seem to be connected, Ramo writes that the Internet is really divided into various gated communities. He states that “gatedness is the corollary to connectedness” and this gatedness is a potential problem.
At one point, he worries that you will have to be among the rulers — presumably those with the seventh sense or at least those controlling the gates — or the ruled. He says the network gives people more power against the gatekeepers than in traditional institutions, but also notes that the average person may nevertheless need to be inside the gate to lead a satisfactory life and make a living – so there’s really no choice after all.
Aside from the problem he mentions, why is this important?
Well, no matter their ideology and internal practices, for the past few centuries all governments have fundamentally been in the business of controlling a specific bordered territory — maintaining the physical gates. He posits that the Internet’s gatekeepers — Facebook, or Apple iOS, etc. — are taking over that role in the cyberworld. He says that they are the powerful ones to watch out for in future wars between networks and the state and between networks and other networks.
Many others have considered the potential of a conflict between governments and the Internet. Last summer, for example, the Wilson Quarterly had an article responding to this concern, “The Nation-State: Not Dead Yet”.
The biggest weakness in the book, and others of this kind, is the lack of nuance in the discussion of networks. The fact that there can be a distribution of power and gates in networks doesn’t end the story. Partly the problem with these books is that the question of which nodes (and entry points) of a network are most influential isn’t one that can be answered merely in words.
Pictures help convey a bit more, and – going back to my graduate school research – mathematics helps even more.
So as you read Ramo’s book and his concerns, you get the sense that his view of the network is similar to this picture of Indiana University’s Big Red network:
But perhaps the world outside of such tightly controlled campuses is more like the collaborative network of Oak Ridge National Lab:
Or something different.
And although a node’s place in a network can show its potential influence, these graphs merely show connections, not actual influence or power. Unfortunately, the publicly available analysis of influence over the billions of nodes and endpoints of the Internet is still primitive. Moreover, to his point, it is also changing.
This book is a bit like Jefferson’s view of the Louisiana Purchase before the Lewis and Clark expedition. Jefferson had a sense it was worth buying, but needed to send out scouts to find out the details. While they didn’t learn everything there was to learn about the territory, much of what they did learn changed over time anyway.
That too will characterize our understanding of the network we explore each day.
At the annual summit of the Intelligent Community Forum two weeks ago, there was a keynote panel consisting of the mayors of three of the most intelligent cities in the world:
Michael Coleman, Mayor of Columbus, Ohio, from 2000 through 2015
Rob van Gijzel, Mayor of Eindhoven, Netherlands, from 2008 to today
Paul Pisasale, Mayor of Ipswich, Queensland, Australia, from 2004 to today
Both Eindhoven and Columbus have been selected as the most intelligent community in the world and Ipswich has been in the Top 7. Columbus also was just selected by the US Government as one of the winners of its Smart City challenge.
The topic was intriguing (at least to those of us who care about economic growth): “International Economic & Business Development — Secrets of international development at the city and region level”.
They did have interesting things to say about that topic. Mayor Coleman pointed out that 3,000 jobs are created for every billion dollars of global trade that Columbus has. He reminded the audience that making global connections for the benefit of the local economy is not a one-time thing, as it takes years to build relationships that will flourish into deep global economic growth.
That reminder of the long-term nature of creating economic growth was a signal of the real secrets they discussed — how to survive a long time in elected office and create a flourishing city.
Part of what distinguishes these mayors from others is not just their success at being elected because the voters thought they were doing a good job. An important part of their success is their willingness to focus on the long term, the future.
By contrast, those mayors and other local officials who are so worried about re-election instead focus just on short-term hits and, despite that, often end up being defeated.
This requires a certain personal and professional discipline not to become too easily distracted by daily events. For example, Mayor Coleman said he divided his time into thirds –
Handling the crisis of the day (yes, he did have to deal with that, just not all the time)
Keeping the city operations going smoothly
Developing and implementing a vision for the future
In another statement of the importance of a future orientation, Mayor Pisasale declared that “economic development is about jobs for your kids” — a driving motivation that’s quite different from the standard economic development projects that are mostly sites for ribbon cuttings and a photo in the newspaper.
He was serious about this statement even in his political strategy. His target groups for the future of the city are not the usual civic leaders. Rather, he reaches out to students (and taxi drivers) to be champions for his vision of the future.
Mayor van Gijzel pointed out that an orientation to the future means that you also have to be willing to accept some failures – something else that you don’t hear often from more risk-averse, but less successful, politicians. (By the way, there’s a lot more detail about this in the book, “The City That Creates The Future: Rob van Gijzel’s Eindhoven”.)
This kind of thinking recalls the 1932 declaration by the most politically successful and most re-elected US President, Franklin Roosevelt:
“The country needs and, unless I mistake its temper, the country demands bold, persistent experimentation. It is common sense to take a method and try it: If it fails, admit it frankly and try another. But above all, try something.”
That brings up another important point in this time of focus on cities. Innovation and future-orientation are not just about mayors.
Presidents aside, another example of long-term vision comes from Buddy Villines, who was chief executive of Pulaski County (Little Rock, Arkansas) for twenty-two years until the end of 2014.
At a time when many public officials are disdained by a majority of their constituents, these long-time mayors – successful both as politicians and for the people of their cities – should be a model for their more fearful peers.
In 2009, I wrote a blog titled: “When Will Citizens Be Able To Track Requests To The Government?”
It’s time to see if much progress has been made, but first some background …
The people that public officials call citizens or voters or residents are not single-minded civic machines. Most of the time, they are consumers and workers outside of the public sector. And so what happens outside of the public sector affects their expectations of what should happen in the public sector.
One of the more frequent parts of a consumer’s life these days is being able to track things. Here are just a few of the many diverse examples, almost all of which have been around for at least a few years: track your Domino’s pizza order from the oven to your front door; track shipments, at all stages, through FedEx or UPS or even USPS; track the path of a car that you ordered via Uber; track an airline flight so you know when to leave for the airport to pick up a relative or friend.
Why not enable citizens to track their government transactions in mid-stream? While suggestions of this kind are often proposed to increase transparency of government, the tracking actually serves a much simpler goal – to reduce frustration on the part of the citizen.
If people can see where their request or application is, they will have a lower sense of frustration and a greater sense of control. If citizens could also get an estimate of how long it usually takes to go through each step of an approval process, all the better.
In the public sector, this kind of tracking was very rare in 2009. The standout was the UK, which, for example, enabled residents to track driving license applications.
Since 2009, we’ve seen some more ways to track requests and applications. This has been especially true of requests under various freedom of information laws, such as the US Justice Department’s. However, the average citizen is not submitting freedom-of-information requests – I suspect that most come from media employees.
You can track your request for US government grants – again, something that the average citizen isn’t focused on. The US Internal Revenue Service’s IRS2Go app lets you track the status of your refund, which is likely to be of interest to a much larger number of people.
While it is difficult for me to judge from this distance how well it actually works, certainly one of the broadest and most ambitious efforts to let residents track their requests is in India, not the US or Europe.
Alas, in New York City, the government’s website tells you to call 311 to track applications for Food Stamps.
In South Carolina, a “Multi-Agency Partnership Portal” provides a reasonably good way of applying for various health and support programs. Although the website refers to seeing the status of the application, it’s not clear from the documentation how you’d do that.
Colorado’s version of the same kind of website, called PEAK, makes it very easy to track status.
Although Indiana also does this, its website seems much more complicated than Colorado’s.
Even the City of San Francisco, which aims to be a technology leader, has had its difficulties in enabling people to do the simple tracking of, for example, building permits. Its website refers back to a partial implementation two years ago, but offers no recent update.
Even worse, one of the examples from 2009 was from the District of Columbia, where you could track the status of building permit applications. If you try that now, you’ll get this backtracking message:
“DCRA has removed its permit status check page, also known as the Online Building Permit Application Tracking (OBPAT) application, from its website.
“DCRA recognizes that some constituents are disappointed about this decision. In short, DCRA found that the information was too often unreliable and resulted in misinformation to constituents. This is totally unacceptable. DCRA is hopeful that the site will eventually be restored, but the data issues must be resolved before it is. DCRA is committed to transparency, but transparency is helpful when accurate information is available. It is DCRA’s goal to have truthful, accurate communication from staff, and the public access sites need to reflect that as well.”
Clearly, there are still many situations where people want to track their interaction with the government and cannot.
(Of course, the ultimate goal, in so far as possible, is to complete those transactions instantaneously online, like the fishing license app that Michigan makes available. Then the tracking problem disappears, but that’s a subject for a future blog post.)
So the answer to the question?
In the last seven years, there has only been a little progress here and there in some areas of government, but not the massive change that technology makes possible.
Consider an analogy. While every government expects that it needs a formal budget document, most apparently don’t yet have an expectation that they need to make it easy for people to find out the status of their requests for common services. In this Internet age that is no longer something new. It’s time to get moving on it.
We’ve just passed the tax deadline and, reflecting on it, I was vexed again by this question: why do we still have these tables of brackets that determine how much income tax we’re supposed to pay?
I can understand there was a time, many decades ago, when the government wanted to keep things simple so each person could easily determine the tax rate that would apply. And I know that the continued use of tax brackets is not the biggest problem around. However, tax brackets are just another symptom of government’s failure to see the widespread deployment of technology among the public and its failure to use basic technology for simple improvements that are appropriate in this century.
Brackets cause some problems. Politicians who advocate a single flat tax rate often start with the argument that their approach would be so simple people could just send in a postcard. Putting aside the merits or demerits of a flat tax, for the moment, there is something retro about telling people to use a postcard in 2016.
From 2000 to 2015, postcard usage dropped by more than two thirds, an even greater drop than in first-class envelope mail. The Washington Post even had a story last year with a headline that asked “Are postcards obsolete?”
Where would we even find these postcards? Would the IRS mail them to us? 🙂
Those who argue for flat taxes or lower taxes in the higher brackets implicitly say that people will work less if it means an obvious jump in tax rates by shifting into a higher bracket. There are also those who advise people how to avoid this problem, as did a Forbes magazine article last month, which started out describing “the key tax challenge facing retirees: being helplessly catapulted into rising tax brackets [because our] tax code is progressive.”
Indeed, with the current set of progressive tax rates, your percentage of tax goes up as your income goes up.
But we no longer have to assume we live in a world limited to paper-based tables.
There is nothing in today’s world that requires the use of brackets in a progressive tax system. Indeed, a system based on a formula instead would eliminate the negative impacts of the bracket-avoiding behavior that critics of progressive taxation point to.
There are a few possible formulas that might work. The most complex would be a logarithmic or exponential curve, which a computer can nevertheless easily compute. If you want to make it even simpler, another formula would set the percentage tax rate as a percentage of income. (Remember school math? TaxRate = m * Income, where m is some small fraction.)
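Here, with invented numbers, is a sketch of that comparison: a toy two-bracket schedule next to the TaxRate = m * Income formula, which produces a smoothly rising effective rate with no cliff to avoid:

    def bracket_tax(income):
        """A toy two-bracket schedule: 10% up to 50,000, then 30% above it."""
        return 0.10 * min(income, 50_000) + 0.30 * max(income - 50_000, 0)

    def formula_tax(income, m=0.000002, cap=0.40):
        """TaxRate = m * Income, capped so the rate never exceeds 40%.
        The slope m and the cap are illustrative, not proposals."""
        return min(m * income, cap) * income

    for income in (40_000, 50_000, 60_000, 200_000):
        print(income, round(bracket_tax(income)), round(formula_tax(income)))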
No matter the formula, computers can handle it. The IRS could make a formula available online or over the phone — just enter your taxable income and it will tell you what you owe. It could be built into the calculator function of cell phones. There are tens of thousands of coders who could finish this app in an afternoon.
Of course, the IRS says that it now offers an app, but it doesn’t take advantage of the computing power of the mobile device nor help you figure out the amount you owe.
While we’re at the effort to bring government into the modern technological era, let’s also consider where those taxes go. Why do we still have fixed budgets?
The budget reform of the 1920s was developed in a world that did not have the ability to dynamically make calculations. So every year, government officials make their best guess on the condition of the economy, the demand from an unknown number of potentially needy citizens, and other factors that determine the ebb and flow of public finances. Since the budget process is lengthy, they make this guess well ahead of time, so they could be trying to predict the future more than 18 months ahead.
A rolling budget would work better by automatically adjusting each month to the flow of revenue and the demands on government programs — and all you need is a big spreadsheet on a not-so-big computer. However, the budget makers would have to decide what their priorities are. For example: for every percentage point of unemployment, we need to put aside $X billion for unemployment insurance payments. It would take work to do this for each of the promises the government makes — although maybe not as much work as trying to guess the future.
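A rolling allocation of that sort is almost trivially easy to compute. The sketch below, with invented figures, recomputes the unemployment-insurance set-aside each month from the latest unemployment reading instead of an 18-month-old guess:

    # $X billion per percentage point of unemployment: an invented policy parameter.
    BILLIONS_PER_POINT = 4.0

    # Invented monthly unemployment readings for the illustration.
    monthly_unemployment = [4.9, 5.1, 5.6, 6.2, 5.8, 5.3]

    for month, rate in enumerate(monthly_unemployment, start=1):
        allocation = BILLIONS_PER_POINT * rate
        print(f"month {month}: unemployment {rate:.1f}% -> ${allocation:.1f}B set aside")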
(Of course, the real obstacle to a rolling budget model is that policy makers would be forced to make their priorities more explicit.)
I could go on, but you get the idea. Buying billions of dollars of technology products is not enough. Government also needs to bring technology into its thinking and design.
Open source software is free and is often developed by collaborations of volunteer programmers around the world, along with staff at big companies who find it in their common interest that this software be maintained.
Open source software has been an enormous success. Most of the web services on the Internet are delivered by Apache software on the Linux platform – both open source projects. Indeed, there are over 25 million open source projects listed on the open source directory of GitHub.
Despite the billions of dollars spent by the public sector for software each year, the public sector’s share of those open source projects has been much smaller than its share of the overall economy. This small commitment to open source is not what you’d expect considering the situation of the public sector compared to the private sector.
First, unlike private companies, governments cannot really consider their software to be proprietary and a key strategic advantage in competing with others. Taxpayers have already paid for the software and, like much else in government, it is supposed to be open and available (unless security or privacy is at stake).
Second, there are many programmers who would help to build and maintain public sector software out of civic spirit and a recognition that they too benefit as citizens.
Code For America has been an outstanding example, but not the only example, of the willingness of software developers to help even local governments. Ben Balter has been a champion for public sector open source as well. The GovCode website proclaims: “Code for your country! We believe in a government of the hackers, for the hackers and by the hackers”. [That’s hackers in the good sense of creative, expert programmers.]
Third, open source software is often much less expensive than commercial, proprietary products and just as often better quality. This is especially true in the public sector, where many of the companies providing products for various special purposes – like jail management or health – aren’t very big themselves, and thus their products are often constrained or sometimes buggy.
It’s hard to come by the total number of software developers in governments or total expenditures for software development – or even packaged software where open source might substitute. But considering how many government agencies there are, it is likely that government at all levels has a lot more developers in total than these companies.
Of course, in fits and starts, open source has come to government, with 2012-2013 a recent peak of interest.
The US Defense Department issued guidelines for use of open source in that agency. And the US Federal government has, officially, been an advocate for open source software – at least within itself. It is not easy to determine what percentage of software the Feds have actually shifted from proprietary to open source.
Another initiative set out to “promote the development and implementation of open source software solutions within U.S. Federal, state, and local government agencies.” Alas, it currently seems to be in a quiescent state.
More recently, GitHub has created a platform for public sector sharing. But much of what is being shared is data or best practices, not software.
It’s
worth noting that, despite the efforts of the former DOD CIO and groups
within the General Services Administration, like 18F, too little of that movement
has been the result of leadership from within the government. Most has
come from outsiders offering to help.
Most significant is this
nuance: even when there is interest, government policies on open source
are focused on using open source software developed elsewhere, like
Linux, and not necessarily contributing to that software or creating new
open source software.
A key reason that open source development has not become the standard
approach in the public sector is the lack of collaboration among
governments. This is hard to understand in an era
when public agencies are strapped for cash. Each public agency may have
only a small software budget. However, pooling their financial and human
resources will achieve a scale that could allow for the creation of
good, feature-rich software for all of them.
This collaboration
need not stop at national borders. For example, the requirements for
software to manage the vaccination of a population are largely the
same across the public health agencies of many countries. Moreover,
several trends should make collaboration easier – the shift to cloud
computing, new forms of communications that can ease discussions among
developers and the decreasing cost of building software compared to
years past.
It is now possible for government technologists, with
support from citizens, to truly scale up their open source software
development efforts.
So what are public officials, especially
public sector CIOs, waiting for?
If you have the answer or, better yet,
suggestions on how to make public sector open source more widely
adopted, please let us all know.
Continuing my annual round-up of news you may not have seen … about
politicians, polling and Google, and being smart and/or sympathetic.
Have you ever wanted to know when politicians were telling the truth? Fiona Zublin has proposed that
politicians be required to wear technology that will
continually assess their performance. As she puts it: “We should be
spying on our leaders instead of them spying on us.”
Part of
what drives a request like that is the feeling that politicians seem to
be increasingly out of touch with the public. Yet, one of the
complaints about politicians is that they are too dependent on polls to
determine what they’ll say and do. Perhaps this contradiction can be
explained by the weakness of the polls they depend on.
In the June issue of Campaigns and Elections magazine, Adam Schaeffer poses the question: “Is it time to pull the plug on traditional polling?”
He touches on just one of the ways that polls are not working, which
is their inaccurate predictions about who will actually vote.
And
if you think polling is off the mark, at least you can count on the value of
the actual election results. But those
too can be easily influenced. It’s been
known for quite some time that the order of names on the ballots has an effect –
perhaps a few percent – on how many votes go to each candidate. With people looking for information about
their candidates online, we now have the situation where WIRED writes that “Google’s Search
Algorithm Could Steal the Presidency”.
Robert Epstein, a psychologist at the American Institute for Behavioral
Research and Technology who did the study of the effects of Google’s
search algorithm, provided more detail in his article, “How Google Could
Rig the 2016 Election: Google has the ability to drive millions of votes
to a candidate with no one the wiser”, last week in Politico:
“Google’s search algorithm can easily shift the voting
preferences of undecided voters by 20 percent or more—up to 80 percent in some
demographic groups—with virtually no one knowing they are being manipulated,
according to experiments I conducted recently with Ronald E. Robertson…
“Given that many elections are won by small margins, this
gives Google the power, right now, to flip upwards of 25 percent of the
national elections worldwide…
“What we call in our research the Search Engine Manipulation
Effect (SEME) turns out to be one of the largest behavioral effects ever discovered…
“Because SEME is virtually invisible as a form of social
influence, because the effect is so large and because there are currently no
specific regulations anywhere in the world that would prevent Google from using
and abusing this technique, we believe SEME is a serious threat to the
democratic system of government.”
With all the talk these days
about “smart” this and “smart” that, even “smart” politicians, it’s
worth reading James Hamblin’s piece, “100 Percent Is Overrated: People
labeled smart at a young age don’t deal well with being wrong. Life
grows stagnant.”
Being focused on academic perfection all
the time may be overrated, but some experts see the need to train
children in social skills. A summary of this argument can be found in a
NY Times article last month, “Teaching Social Skills to Improve Grades and Lives”.
And sometimes the sympathetic listener isn’t even human, as one recent news report about a Chinese chatbot described:
“She
is known as Xiaoice, and millions of young Chinese pick up their
smartphones every day to exchange messages with her, drawn to her
knowing sense of humor and listening skills. People often turn to her
when they have a broken heart, have lost a job or have been feeling
down. They often tell her, I love you.”
Perhaps
this reflects a lack of social skills and empathy on the part of
Chinese political leaders as well. I wonder if they’re also using bad
polling 😉
Ah, a boring subject – government budgets – except that the average American turns over a quarter or so of family income to the budget makers.
To make matters worse – although most taxpayers haven’t thought about it much – the standard approach that most governments use each year to prepare their budgets is, at least in the USA, almost a hundred years old. Of course, a hundred years ago a budget was the latest reform 🙂
[Figure: the summary portion of New York City’s latest budget.]
Typically, agencies are asked to start planning their budget proposals way ahead of the fiscal year. So it’s possible they could be proposing a spending plan 18 months or more ahead of the actual time they need to deliver services – without knowing all the factors that could change during that time.
Do they know how much snow will need to be removed? How many people will need unemployment insurance? Whether there will be an outbreak of the flu that affects everything from school attendance to public employees being able to work? How much money will there be from income or sales taxes in an economy whose future is not certain?
Is it any surprise that a fixed budget leads to mis-allocation of public funds considering the real problems that might exist at any moment after that budget is approved?
This fixed budget process was developed in an era before readily available computer technology, “big data” and the frequent changes that government has to deal with today.
As I wrote about fixed tax brackets, technology now makes it possible to fix the traditional fixed budget. It is no longer the reform it once was – indeed, it stands in the way of running a more efficient and adaptable government today.
There have been variations on the theme, such as performance-based budgeting, zero-based budgeting, etc. But not much has changed about budgeting in most governments for a long time, except that now the budgets are kept on computers instead of printed documents.
All of these approaches, in one way or another, try to match the priorities among the demands on government with its possibly changing revenues.
Perhaps the most interesting innovations have been around priority budgeting. In its 2011 report, titled “Anatomy of a Priority-Driven Budget Process”, about Snohomish County, Washington State, the Government Finance Officers Association (GFOA) summarizes the approach.
This is an especially useful area for citizen input, including the use of web-based collaboration platforms. The average person is much better at defining the relative importance of various outcomes to himself/herself than at understanding the implications of a dollar amount that sits on a line in a budget. In addition to GFOA, other associations of government officials – the International City/County Management Association (ICMA) and the National League of Cities – have been trying to educate their members about priority budgeting. They have been working with the Center for Priority Based Budgeting.
Variations of priority-based budgeting have been used in Boulder, CO, which ICMA has reported on, and in Cincinnati, OH, among a few dozen other jurisdictions.
An important assumption underlying this more flexible budgeting is that government decision makers cannot foretell the future with precision. So, even the priority-based budget may need to be changed during the course of the year as the public and its leaders learn from what they’ve spent on so far and as new needs arise.
Technology today makes possible a more dynamic approach to managing government finances than in the past because it makes these four key aspects of flexible budgeting feasible:
Identify the cost of delivering each kind of outcome the government has in mind
Prioritize those outcomes through some combination of public values and cost-effectiveness
To get things started, estimate the revenue expected to come in and the volume of demand for each outcome.
Adjust on a monthly basis
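To make those four steps concrete, here is a minimal sketch in Python of the monthly adjustment loop. The outcomes, unit costs, demand figures and revenue estimates are all hypothetical illustrations, not anyone’s actual budget.

```python
# A minimal sketch of the monthly adjustment loop in priority-driven,
# flexible budgeting. Outcomes, unit costs, demand and revenue are
# all hypothetical illustrations.

from dataclasses import dataclass

@dataclass
class Outcome:
    name: str
    unit_cost: float  # cost to deliver one unit of the outcome
    priority: int     # 1 = highest public priority

def allocate(outcomes, demand, revenue_estimate):
    """Fund outcomes in priority order until estimated revenue runs out."""
    plan, remaining = {}, revenue_estimate
    for o in sorted(outcomes, key=lambda o: o.priority):
        needed = o.unit_cost * demand[o.name]
        plan[o.name] = min(needed, remaining)
        remaining -= plan[o.name]
    return plan, remaining

outcomes = [
    Outcome("unemployment claims processed", 45.0, 1),
    Outcome("snow removal (lane-miles)", 1_200.0, 2),
    Outcome("flu shots administered", 30.0, 3),
]

# Month 1: initial demand and revenue estimates.
print(allocate(outcomes,
               {"unemployment claims processed": 20_000,
                "snow removal (lane-miles)": 500,
                "flu shots administered": 8_000},
               revenue_estimate=2_000_000))

# Month 2: a harsher winter and a weaker economy change both demand
# and revenue, so the allocation is simply recomputed.
print(allocate(outcomes,
               {"unemployment claims processed": 26_000,
                "snow removal (lane-miles)": 900,
                "flu shots administered": 8_000},
               revenue_estimate=1_900_000))
```

The point of the sketch is the last step: when conditions change, the plan is recomputed rather than defended.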
This obviously requires some flexibility in the allocation of human resources. Some aspects of government are not that flexible – for example, you can’t train a new police officer overnight – so there are bound to be some inflexibilities even in this approach, but far fewer than under the entirely rigid traditional approach.
Besides, such a situation might encourage people in government to get creative. If crime is going up, maybe people will realize that not all tasks assigned to police officers require an officer. If crime is going down, maybe there are some on the police force who can work on other things.
But getting more creativity in government is a story for another time. For now, please let me know if you’re aware of more flexible budgeting in the public sector or you want to explore this more for your government.
A few years ago, when my son was a high school teenager, he was totally absorbed in online multi-player games. One day, I heard him talking to his friends during the game (using a form of voice over IP, like Skype). So, thinking these might be high school buddies, I asked who he was talking to. He said there was one boy from Korea, another from Mexico and a third from Russia.
As I told the chief elected executive of our county at the time, my son’s body was there all day long, but his mind was spending lots of time outside of the county (even the country).
This phenomenon is not limited to teenage boys. People of all ages are generally more attentive to life online than they have ever been before. In the US alone, three quarters of the people use social media.
Think about where you spend your “mind-time”.
This is not the old philosophical debate about the mind-body problem; a new, digital-age version has emerged, in which body and mind are in different places.
Moreover, we are actually in the early days of the Internet because our communications with each other generally are not visual. Without conversational videoconferencing, a major means of communicating fully and building trust is absent from online communities. We’ll really see the impact when those visual tools are more widely used.
This situation poses an increasing challenge for public officials.
With their attention focused in all kinds of places around the globe, people are virtually living in multiple jurisdictions. To which jurisdiction does a person owe primary loyalty or interest? Could they be good citizens of more than one jurisdiction? In any case, if their attention is divided, doesn’t that have an impact? What if they just don’t care about local officials and their government?
Some cynical political advisers might well like a situation that reduces citizen attention and engagement since it makes the outcome of elections and lawmaking more predictable. But smarter elected officials realize that eventually a lack of public engagement stands in the way of getting things done. In other countries, lack of engagement, knowledge and trust in government has led to failures to pay taxes or even to people physically leaving a jurisdiction forever.
Over the last few decades we’ve seen an erosion of trust in this country as well, as the Pew studies, among others, have shown.
Some people attribute the lower trust to the time people spend online, which they view as another form of Bowling Alone, as Professor Robert Putnam titled his most famous book. If anything, the causality may be the reverse – it might be the case that people seek to be engaged in online communities because their physical communities are no longer as inviting to them as a result of the overall decrease in social capital that Putnam portrayed. But that’s a separate story.
Although this may strike many public officials as something new, the study of virtual communities and their implications goes back at least as far as Howard Rheingold’s seminal book on the subject in 1993.
Much of the research that has been done so far would indicate that online communities and physical communities have many characteristics in common – both positive and negative.
Size is a good example. Does a person have a greater sense of belonging to an online community of a few hundred or a physical, offline city of a million?
Unfortunately, there hasn’t been much research or data collection about where people are spending their mind-time and what its implications are, especially for government. For that reason, the Algorithmic Citizenship measure is interesting to follow.
Please let me know if you’re aware of other attempts. And I’ll keep track of the work of the Citizen Ex project.
I’ve written, as recently as a couple of weeks ago, about innovation in government. There are many examples, although many more are needed. Despite – or maybe because of – financial constraints and opposing interests who are ritually stuck in old debates, creativity rules in government as elsewhere.
But I was reminded by readers that public officials – or executives of corporations, for that matter – don’t always know how to create a culture of innovation. In response, I remembered a book published a bit less than a year ago, titled “Creativity, Inc.” by Ed Catmull, the co-founder and president of the very successful animation film studio, Pixar, and now also the head of Disney Animation.
The book is partly a biography and partly about film making. But it is mostly one of the best books on management in a long time. Many reviewers rightfully cite his wisdom, balance and humility and note that this book goes beyond the usual superficialities of most management books.
Catmull talks about how to run a company in a creative business, but it applies to many other situations. It certainly applies to the software and technology business, in general. It also applies to government.
One of the major themes of the book is that things will always go wrong and perfection is an elusive goal, even in companies that produce outstanding work. Leaders need to set the proper frame for all stakeholders.
To put this in the context of politics, a successful elected official I know has concluded that it’s not a good idea to go around (figuratively) wearing a white robe, touting your perfection. As soon as one small spot appears on that white robe, it will be noticed and condemned by everyone. Instead, it’s best to let the public know that you too are human and will make a few mistakes, but those mistakes are in the interest of making their lives better.
Catmull puts it this way:
“Change and uncertainty are part of life. Our job is not to resist them but to build the capability to recover when unexpected events occur. If you don’t always try to uncover what is unseen and understand its nature, you will be ill prepared to lead.”
“Do not fall for the illusion that by preventing errors, you won’t have errors to fix. The truth is, the cost of preventing errors is often far greater than the cost of fixing them.”
The last point has a larger message: that success is less about the right way [the process] to fix a problem than actually fixing the problem.
“Don’t confuse the process with the goal. Working on our processes to make them better, easier, and more efficient is an indispensable activity and something we should continually work on— but it is not the goal. Making the product great is the goal.”
Government, in general, would do well to convert as many activities as it can from being processes to being projects, whose aim is to achieve clear and discrete results.
Along with many of us who have supported open innovation and citizen engagement, he points out that good ideas can come from anywhere inside or outside the organization:
“Do not discount ideas from unexpected sources. Inspiration can, and does, come from anywhere.”
And he adds that good managers don’t just look to employees for new solutions, but for help in an earlier stage – defining what the real problem is.
In government, you often hear the line that “information is power” and thus many leaders hoard that information. Catmull, on the contrary, argues for the need for open communication:
“If there is more truth in the hallways than in meetings, you have a problem. Many managers feel that if they are not notified about problems before others are or if they are surprised in a meeting, then that is a sign of disrespect. Get over it.”
Of course, actually having good communications isn’t any easier in government than it is anywhere else. Catmull suggests that it is the top leaders who have to make the major effort for good communications to occur and it is in their own interest. How many times have you been blindsided by something that others knew was a problem, but didn’t reach you until it was a full-fledged crisis?
“There are many valid reasons why people aren’t candid with one another in a work environment. Your job is to search for those reasons and then address them. … As a manager, you must coax ideas out of your staff and constantly push them to contribute.”
This brief review doesn’t do justice to the depth of the book. And I’m sure that many public officials could draw more parallels than I have.
Clearly the government would run better, the public would be better served and public officials would be more successful if creativity ruled in the public sector as well as it has at Pixar.
The title may put you off. You may be
thinking that we’ve all been reading that the NSA has been doing plenty of
listening.
But
are there other uses for listening by the government – uses that have
nothing to do with national security or listening in on personal phone calls?
As
social media on the Internet have developed over the last several years,
companies have found a gold mine of information that can help them better
understand their customers’ views, needs and moods – and to better assess the
value (or lack of value) of the products and services they offer.
Often
called sentiment analysis,
this has been applied in a variety of ways.
(Note: in a blog post I can only touch upon what is a large, developing
and growing topic, so if this intrigues you, use this piece as just a starting
point.)
Investment
firms use sentiment analysis to determine the future direction of corporate
securities. See, for example, Stock Sonar. Sentiment140
is a company that measures sentiment about products and brands.
CrowdFlower uses five million people in
its sentiment projects. Earlier this
week, the company released its Data For Everyone Library
which makes that information available on topics as different as immigration, Coachella
2015, wearable technology and sports, among others.
Software to analyze sentiment, either with
or without human assistance, is a major focus of research in various
universities and tech businesses. One of
many examples that you can try for free is at http://www.danielsoper.com/sentimentanalysis/default.aspx .
The Kapsik project is another example. As an illustration, their website shows the
trend in sentiment about London Mayor Boris
Johnson.
I’d
expect that there would be many elected and other public officials who might
want to check their daily sentiment index.
Now they can do it.
But
it’s not just about satisfying the egos or re-election needs of individuals.
Officials
can use sentiment analysis of publicly available tweets, posts, etc. on the Internet
as an important addition to their toolkit of ways to understand what is working
and not working for their constituents.
It can also be a way to discover issues that are bubbling up, but haven’t
yet reached the stage where they explode in the faces of officials.
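For the curious, a first experiment takes remarkably little code. Here is a minimal sketch using VADER, the open source sentiment analyzer bundled with Python’s NLTK library; the sample constituent posts are invented for illustration.

```python
# A minimal sketch: scoring constituent posts with VADER, the open
# source sentiment analyzer bundled with NLTK. The sample posts are
# invented; a real system would pull in public tweets and the like.

import nltk
from nltk.sentiment.vader import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)  # one-time lexicon download
analyzer = SentimentIntensityAnalyzer()

posts = [
    "The new pothole crew fixed our street in two days. Impressive!",
    "Still no answer from the permits office after three weeks. Frustrating.",
    "Library hours were extended on weekends. Thank you!",
]

for post in posts:
    scores = analyzer.polarity_scores(post)
    # 'compound' runs from -1 (most negative) to +1 (most positive).
    print(f"{scores['compound']:+.2f}  {post}")
```

Averaging those compound scores over each day’s mentions is, in essence, the “daily sentiment index” mentioned above.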
Government can also learn from how companies listen. Here, for example, is a description of the Listening Center created by Southwest Airlines:
“The airline launches a Listening Center to centralize social, industry, and operational data. … [It] is an internal resource that combines social conversations, industry news, and operational data into one central hub… To monitor what customers are saying about the brand, industry, or a specific topic, Southwest uses a keyword-based listening tool that pulls in mentions from social platforms like Twitter. As for staying on top of its operational information, like departures and arrivals, Southwest has a satellite Listening Center inside of its Network Operations Control center (NOC). This real-time insight allows the airline to identify issues and engagement opportunities quickly … and then react accordingly via the channels that customers are using.”
Government could combine into one hub its own operational data, news about other important events going on in the public and private sectors and what citizens are saying in public forums. Now that’s a listening post that would provide clear positive benefits for everyone – without any scary Big Brother controversies.
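To make the idea a bit more concrete, here is a skeletal sketch of the routing core of such a hub: a keyword-based listener that files items from any feed – social mentions, news, operational alerts – into topic buckets. The topics, feeds and sample items are all hypothetical.

```python
# A skeletal sketch of the routing core of a government "listening
# hub": items from any feed (social, news, operations) are filed
# into topic buckets by simple keyword matching. Topics, feeds and
# sample items are all hypothetical.

from collections import defaultdict

TOPICS = {
    "snow":    ["snow", "plow", "ice"],
    "transit": ["bus", "subway", "delay"],
    "permits": ["permit", "license", "inspection"],
}

def topics_for(text):
    """Return every topic whose keywords appear in the text."""
    lowered = text.lower()
    return [t for t, keywords in TOPICS.items()
            if any(k in lowered for k in keywords)]

hub = defaultdict(list)
incoming = [
    ("Main St still not plowed after last night's snow", "social"),
    ("Route 12 bus delay: mechanical problem at the depot", "operations"),
    ("County streamlines building permit backlog", "news"),
]
for text, source in incoming:
    for topic in topics_for(text):
        hub[topic].append((source, text))

for topic, items in sorted(hub.items()):
    print(topic, "->", items)
```

A production hub would add sentiment scoring, deduplication and real feeds, but the organizing principle – many sources, one topic-centered view – is just this simple.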
The National Association of Counties just concluded its annual mid-winter Legislative Conference in Washington, DC. I was there in my role as NACo’s first Senior Fellow.
As usual, its Chief Innovation Officer, Dr. Bert Jarreau, created a three-day extravaganza devoted to technology and innovation in local government.
The first day was a CIO Forum, the second day NACo’s Technology Innovation Summit and the final day a variety of NACo committees on IT, GIS, etc.
County governments – especially the best ones – get too little recognition for their willingness to innovate, so I hope this post will provide some information about what county technologists and officials are discussing.
One main focus of the meetings was on government’s approach to technology and how it can be improved.
Jen Pahlka, founder and Executive Director of Code For America and former Deputy Chief Technology Officer in the White House, made the keynote presentations at both the CIO Forum on Friday and the Tech Summit on Saturday – and she was a hit in both.
She presented CfA’s seven “Principles for 21st Century Government”. The very first principle is that user experience comes before anything else. The use of technology is not, contrary to some internal views, about “solving” some problem that the government staff perceive.
She pointed out that the traditional lawyer-driven design of government services actually costs more than user-centric design. (I’ll have more on design in government in a future blog post.)
She referred to the approach taken by the United Kingdom’s Government Digital Service. For more about them, see https://gds.blog.gov.uk/about/. When she was in the White House, she took this as a model and helped create the US Digital Service.
She also discussed the importance of agile software development. She suggested that governments break up their big RFPs into several pieces so that smaller, more entrepreneurial and innovative firms can bid. This perhaps requires a bit more work on the part of government agencies, but they would be rewarded with lower costs and quicker results.
More generally she drew a distinction between the traditional approach that assumes all the answers – all the requirements for a computer system – are known ahead of time and an agile approach that encourages learning during the course of developing the software and changing the way an agency operates.
By way of example, she discussed why the Obamacare website failed. It used the traditional, waterfall method, not an agile, iterative approach. It didn’t involve real users testing and providing feedback on the website. And, despite the common wisdom to the contrary, the development project was too big and over-planned.
It was done in a way that was supposed to reduce risk, but instead was more risky. So she asked the NACo members to redefine risk, noting that yesterday’s risky approach is perhaps today’s prudent approach.
The development of cloud computing is helping along. Oakland County (Michigan) CIO Phil Bertolini, for example, has found that cloud computing is reducing government’s past dependence on big capital projects to deploy new technology, thus allowing for more day-to-day agility.
Finally Jen Pahlka suggested that government systems needed to be more open to integration with other systems. In a phrase, “share everything possible to share”. She showed an example where the government let Yelp use government restaurant inspection data and in turn learn about food problems from Yelp users. (And, of course, sharing includes not just data, but also software and analytics.)
In another illustration of open innovation in the public sector, Montgomery County, MD recently created its Thingstitute as an innovation laboratory where public services can be used as a test bed for the Internet of Things. Even more examples were discussed in the IT Committee. Maricopa County, Arizona and Johnson County, Kansas, both now offer shared technology services to cities and nearby smaller counties. Rita Reynolds, CIO of the Pennsylvania County Commissioners Association, discussed the benefits of adopting the NIEM approach to data exchanges between governments.
The second major focus of these three days was cybersecurity.
Dr. Alan Shark, Executive Director of PTI, started off by revealing that the latest surveys show security is the top concern of local government CIOs for the first time. Unfortunately, many don’t have the resources to react to the threat. Actually, it’s more a reality than merely a threat. It was noted that, on average, it takes 229 days for organizations to find out they’ve been breached and that close to 100% have been attacked or hacked in some way. It’s obviously prudent to assume yours too has been hacked.
Jim Routh, Chief Information Security Officer (CISO) of Aetna insurance recommended a more innovative approach to responding to cybersecurity threats. He said CIOs should ignore traditional advice to try to reduce risk. Instead “take risks to manage risk”. (This was an interesting, if unintentional, echo of Jen Pahlka’s comments about software development.)
Along those lines, he said it is better to buy less mature cybersecurity products, in addition to or even instead of the well-known products. The reason is that the newer products address new issues better in an ever changing world and cost less.
There was a lot more, but these highlights provide plenty of evidence that at least the folks at NACo’s meetings are dealing with serious and important issues in a creative way.