Interesting Books In 2020

There have been a lot of things we haven’t been able to do during the last nine months. But it’s been a good time for reading ebooks and listening to audiobooks. So my on-again-off-again tradition of highlighting interesting books that I have read in the year is on again.

These books have not all been published during the last year, but are ones I’ve read this past year and thought worth mentioning to other folks who read this blog.  You’ll note that this is an eclectic combination of books on technology, government, the economy and other non-fiction – but that’s the range of topics that my blog is about.

Anyway, here’s my list for 2020 and a blurb as to why each book is on the list.  I have obviously eliminated from the list the many other books that I’ve read, which I would not recommend you spend your time on. 😊

Technology, AI/Machine Learning and Science

  1. David Carmona – The AI Organization: Learn from Real Companies and Microsoft’s Journey How to Redefine Your Organization with AI (2019). Perhaps too many examples from Microsoft, but it is a really good book from A to Z on artificial intelligence.
  2. Cliff Kuang and Robert Fabricant – User Friendly: How the Hidden Rules of Design Are Changing the Way We Live, Work, and Play (2019). Very interesting review of the leading good (and sometimes bad) user interfaces.
  3. Matthew O. Jackson – The Human Network: How Your Social Position Determines Your Power, Beliefs, and Behaviors (2019). Good, understandable explanations of network measures and phenomena in various domains.
  4. Damon Centola – How Behavior Spreads: The Science of Complex Contagions (2018). Provides a nuanced view of the best time to use weak or strong ties, especially in leading changes in an organization or community.
  5. Eric Topol – Deep Medicine: How Artificial Intelligence Can Make Healthcare Human Again (2019). Although it is mostly about the ways that artificial intelligence can re-humanize the patient-doctor relationship, it even has a pretty good, understandable review of general artificial intelligence and machine learning concepts.
  6. Lisa Feldman Barrett – How Emotions Are Made: The Secret Life of the Brain (2017). The title highlights emotions, but this book is not just about emotions. It instead offers a paradigm shift about how the brain works.
  7. Jodie Archer and Matthew L. Jockers – The Bestseller Code: Anatomy of a Blockbuster Novel (2016). Interesting book, better and more nuanced than the usual summaries about machine learning models to predict the success of books.
  8. Leonard Mlodinow – The Drunkard’s Walk: How Randomness Rules Our Lives (2009). Interesting explanations of the implications of probability theory and how most people get probability wrong.
  9. Scott Rigby and Richard M. Ryan – Glued to Games: How Video Games Draw Us In and Hold Us Spellbound (2011). Good review of computer-based games, especially the psychological aspects.

Leadership And Business

  1. Jim McKelvey – The Innovation Stack: Building an Unbeatable Business One Crazy Idea at a Time (2020). Good, insightful and sometimes funny book by one of the co-founders of Square, with the proposition that success is the result of a chain (a better word than stack) of innovations rather than just one big one.
  2. Scott Kupor – Secrets of Sand Hill Road: Venture Capital and How to Get It (2019). If you want to know how venture capitalists look at startups, this tells you how.
  3. Geoffrey G. Parker, Marshall W. Van Alstyne, Sangeet Paul Choudary – Platform Revolution: How Networked Markets Are Transforming the Economy – and How to Make Them Work for You (2017). While other books on the subject go more deeply into the broader policy implications of platforms, if you want to start a platform business, this is your best, almost required, user manual.
  4. Daniel Coyle – The Culture Code: The Secrets of Highly Successful Groups (2018). Culture is a frequently used word to explain the forces that drive behavior in organizations, but too often the concept is fuzzy. This book is one of the clearest and best on the subject.
  5. Dan Heath – Upstream: The Quest to Solve Problems Before They Happen (2020). Good, as usual for the Heath brothers: well written and down to earth, but with important concepts underneath and guidance on looking at the more fundamental part of the problems you are trying to solve.
  6. Matt Ridley – How Innovation Works: And Why It Flourishes in Freedom (2020). Includes many short histories of key innovations, not just inventions, with an emphasis on the iterative and collaborative nature of the innovation process. Ridley advocates curtailing IP protections and greater tolerance of risky experiments and innovations.
  7. Rita McGrath – Seeing Around Corners: How To Spot Inflection Points In Business Before They Happen (2019). Columbia Professor McGrath has made clear that no strategy is sustainable for a long time, and in this book she helps you figure out when you are at good or bad inflection points.

The Economy And Government

  1. Robert H. Frank – Under the Influence: Putting Peer Pressure to Work (2020). Frank is one of the most creative economists around and in this review of behavioral economics, he highlights how people pursue relative positions of wealth, rather than merely being rational maximizers of wealth.  He also offers a good discussion of public policies to pursue, that are based on this understanding of economic behavior.
  2. Stephanie Kelton – The Deficit Myth: Modern Monetary Theory and the Birth of the People’s Economy (2020). Well written, clear exposition of modern monetary theory and the positive and negative consequences of having completely fiat money (no gold standard or fixed currency exchanges). Professor Kelton is an increasingly influential economist and her ideas – whether or not she is given credit – have enabled the US Government to spend more with less angst than used to be the case.
  3. Abhijit V. Banerjee and Esther Duflo – Good Economics for Hard Times: Better Answers to Our Biggest Problems (2019). A review of economics research – and, more important, its limits – in addressing major socio-economic problems.
  4. Matthew Yglesias – One Billion Americans: The Case for Thinking Bigger (2020). Although no one (including me) will agree with everything he proposes, this is an interesting book with some original forward thinking – something we need more of as we face a very changed future.
  5. Michael Hallsworth and Elspeth Kirkman – Behavioral Insights (2020). This is a good overview of the application of behavioral research, mostly to public policy, especially in the UK.
  6. Paul Begala – You’re Fired: The Perfect Guide to Beating Donald Trump (2020). Smart and realistic proposals for the campaign to oppose Trump, with many very funny lines.
  7. Jane Kleeb – Harvest the Vote: How Democrats Can Win Again in Rural America (2020). Along with Begala, explains her own success in rural America and more generally what needs to be done by Democrats to regain their old reputation as the party of the majority of people.
  8. Mark Lilla – The Once and Future Liberal: After Identity Politics (2017). Short review of how the Democratic party became dominated by identity politics and, for that reason, provides a bit of background for the previous two books.

Have a happy holiday season and a great, much better, year in 2021!

© 2020 Norman Jacknis, All Rights Reserved

The Limits To Being Different

Product differentiation is often described as the key to business success. Companies are told that unless they really stand out from the crowd, their products or services will become “commoditized” — an undesirable position in the marketplace that results in little or no profit. This has been a well-established guideline in the world of technology startups and even new technology-based product development in existing companies.

And that guidance is mostly right. Distinguishing your products from the crowd of competitors often results in greater than average profits. Consider Apple, with less market share than Android, but lots more profit than its smartphone competitors.

Of course, how to go about this is not so simple. One of the best and most inspiring books about how to differentiate — how to be really different — is Harvard Business School Professor Youngme Moon’s book, Different: Escaping the Competitive Herd — Standing Out In A World Where Conformity Reigns But Exceptions Rule.

These quotes summarize her forceful advice:

“What does it mean to be really different? Different in a way that makes a difference. It could mean doing the opposite of what everyone else is doing — going small when everyone else is going big…

“You could even say that breakaway brands revel in our stereotypes, since they make their living turning them upside down…

“These brands are the antithesis of well-behaved, and their mutiny is directed squarely at the category assumptions we bring to the table. And sometimes the transgression is more than a touch provocative; it’s a bit twisted as well. …

“What a breakaway positioning strategy offers is the opportunity to achieve a kind of differentiation that is sustainable over the long term. … it has no competitors; it remains sui generis.”

This advice applies not only to business, but can also apply to politics. That’s why I wrote a post four years ago called “The Breakaway Brand Of 2016” about the 2016 US Presidential election. Although I doubt that he read her book and his approach certainly didn’t please Professor Moon, Trump seemed to have been using it as his playbook for the 2016 election. His was the perfect exemplar of a breakaway brand in politics.

But the 2020 election also showed the limits of this approach. In a two-way election in the US, you need a majority (putting aside the Electoral College, for the moment).


It is also often the case that being different means you won’t get a majority, as both Apple and Trump have found out. For Apple, that’s not a problem. For Trump, it meant he lost the election.

While he did receive many votes, the limits of breaking too far away in politics were well stated by the most successful politician in American history, Franklin Roosevelt: “It is a terrible thing to look over your shoulder when you are trying to lead — and find no one there.”

The limits of extreme differentiation are clear enough in electoral contests. But the election result also reminded me that there are limits to being different in business too. I’m especially thinking of most established technology-based, multi-sided platform businesses (like Amazon) and other businesses that depend on direct network effects (like Facebook).

These businesses also need to have a majority (or even more) of the market. That’s because their value to customers depends a lot on network effects. Being too different for most people will mean you do not end up getting the majority of people as customers.

So, differentiating — even creating breakaway brands — is certainly good advice in general. But like any advice, it is not always appropriate. And the art of leadership is knowing when not to follow generally good advice and take a different road — even a different road about being different.

© 2020 Norman Jacknis, All Rights Reserved

Leveling The Playing Field?

This past week saw the start of the COVID-postponed Intelligent Community Forum’s Annual Summit – now virtual and continuing over two weeks.  As usual in my role as Senior Fellow at ICF, I made a presentation yesterday and led a workshop on “Bringing Broadband To Your Community”.

I have previously reported on what is happening in cities this year. In the face of COVID-inspired video conferencing and the departure from offices and some previously popular cities, the question arises again – can we level the playing field between the biggest metropolises and the parts of the US that have not had broadband?

Many communities now recognize that, without broadband, they will be completely left out of a post-COVID economy.  They are hoping that some outside organization – a benevolent telecommunications company or some government agency – will come in and make the necessary investment so that their community has the broadband it needs.

Considering how many politicians have included broadband as a basic part of our infrastructure, it may be possible that at least the government will provide a lot of funding next year.  But it is worth noting that talk about the government investing in broadband is not new and not all that much has happened in the past.

So in my presentation at the ICF summit, I drew attention to some examples of communities that just went ahead and built this for themselves.  You may have already heard of Chattanooga, Tennessee and Lafayette, Louisiana, both of which deployed broadband through their electric utilities that are owned by the city government.

But here I want to give some credit to two examples that are not so well known.  The first is in a poorly served urban community in San Francisco.  The second is in a rural area that had expected to be the last to get broadband in England.

Although San Francisco bills itself as the high-tech capital of the world, the reality is that 100,000 of its residents (1 in 8) do not have a high-speed Internet connection at home.  This situation, by the way, is not unique to San Francisco.  Many otherwise well-connected cities have vast areas without affordable broadband – not quite Internet deserts, but with the Internet effectively out of reach to low-income residents for technical or financial reasons.

So in conjunction with an urban wireless Internet provider, Monkeybrains (great name!), the city government rolled out its Fiber to Housing initiative last year.  According to a November 2019 report, “Can San Francisco Finally Close its Digital Divide?”, they had already delivered free, high-speed internet to more than 1,500 low-income families in 13 housing communities – public housing.  By this past summer, the number had increased to 3,500 families.  While there is still a long way to go, the competition has already forced traditional Internet service providers to step up their game as well.

In a very different community in rural England, there is a related story, except this region, unlike San Francisco, is the last place you would expect to find broadband.  In the northwest corner of England, surrounding the not-so-big city of Lancaster (population around 50,000), a non-profit community benefit society was created to provide broadband for the rural north.  It is called B4RN.

As they proclaim on their website, they offer “The World’s Fastest Rural Broadband [with] Gigabit full fibre broadband costing households just £30/month”.  As of the middle of last year, they had more than 6,000 fully connected rural households.

In speaking with Barry Forde, CEO of B4RN, I learned a part of the story that should resonate with many others.  The community leaders who wanted to bring broadband to their area tried to explain to local farmers the process of building out a fiber network.  They noted that the technology costs of these networks are often dwarfed by the construction costs of digging in the ground to lay the fiber. The farmers then responded that digging holes was something they could do easily – they already had the equipment to dig holes for their farming!  With that repurposing of equipment, the project could move much more quickly and less expensively.

I can’t go into the whole story here, but this video gives a good summary of the vision and practical leadership that has made B4RN a success.

Frankly, if B4RN can do it, any community can do it.  Whether it’s in one of the most costly cities or in the remote countryside, a little creativity and community cooperation can make broadband possible.

And it need not be a choice between gigabit everywhere from the start and nothing at all.  Build what you can, get people to use it, and the demand will grow to support upgrades.  An intelligent community grows step by step this way.

These were the important lessons of the ICF Summit yesterday.

© 2020 Norman Jacknis, All Rights Reserved

Thinking About Something New: Brain Twisting Is Unnecessary

If you have a new product or service in mind, you know that you need to find a way to differentiate it from the alternatives that people are already using or could use.  But then maybe you have a hard time coming up with ways to make what you are offering really different and new.

This is basically a challenge to your creativity. And many of us think we need to twist our brains to come up with good creative ideas, which is hard work we don’t feel we can do.

Although we have come to expect frequent new technology products, the challenge of creativity is especially hard for technologists.  They have lived in a world that demands no software bugs, no downtime and the like.  They are perfectionists by training (as the A students many of them were in school) and maybe by nature.

A perfectionist mindset undermines the kind of experimental approach and its possibility of failure which is necessary for innovation.  For that reason, creativity can seem to be an insurmountable, impossible challenge – to be both perfect and creative is a low probability occurrence.

Coming up with new ideas shouldn’t be such a challenge.  Consider just two of many authors.  Tina Seelig, Professor of Practice at Stanford, has written and spoken about creativity and innovation.  The titles of two of her books offer a quick summary of her themes — “InsightOut: Get Ideas Out of Your Head and Into the World” and “inGenius: A Crash Course on Creativity”.


William Duggan of Columbia Business School has also written “Creative Strategy: A Handbook for Innovation” in which he champions the innovation matrix as a means of generating new ways of looking at the world. You break down what you’re trying to do into its parts and then search for any company that provides a model of how to do that part well. It’s a tool for what’s called recombinant innovation.

In addition to books on creativity, however, consider a methodology for analysis and software design from more than forty years ago that was named after its originators – Yourdon and DeMarco.  If it is remembered at all, it is for data flow diagrams.


That’s not what I want to emphasize here. Nor do I plan to lead an effort to revive the popularity of Yourdon-DeMarco structured analysis/design and the classic waterfall development lifecycle that it aimed to improve.  Nor am I advocating for the underlying idea that there could be a complete and correct design up front in that lifecycle.

Yourdon and DeMarco had even more important guidance for software designers, although that seems to have been lost in the history of software design.

That guidance:  think more conceptually, more abstractly.  They distinguished between the logical level (the “what”) and the physical level (the “how”).   At the physical level, you would describe the implementation.  At the logical level, traditionally, you would describe essentially what the organization is trying to do.  When thinking about a problem, separate out its implementation (how you see it operate) from its intention.

When it comes time to re-design a system or designing a new product, you first rearrange what is happening at the logical level.  Only after that makes sense to everyone do you worry about how it will be implemented.

By the way, this is not something that requires an excessive amount of writing upfront.  Instead, it is often better to explain this to someone else verbally, because you are trying to communicate clearly and concisely in conversation rather than impress someone with a document.

Look at what is happening and describe it in simple words, before you use a fancy name for it that you might have been taught.  Often the solution to a problem is obvious if you listen to yourself carefully.  (Maybe recording it helps.)  That’s what you should start with.

Thinking this way makes things clear, and clarity yields insight. Sometimes the solution can be blindingly simple once you look at things conceptually. The ancient story of Alexander the Great and the Gordian knot is a good example. The knot only had to be undone. Instead of meticulously searching for where to pull on it so it would unravel, he just cut it.

One often-cited example of the reverse approach, and of the missed opportunities that result, is in the transportation industry.  When airplanes and airlines first appeared, there was an opportunity for the railroads to invest and own the new industry.  Instead of thinking of themselves as the movers of people and goods over long distances (the higher conceptual level), they thought of themselves as the operators of railroads (the lower physical level).  As they say, the rest is history.

You don’t need to twist your brain to arrive at innovative solutions.  Actually, conventional thinking often requires more brain twisting than creative thinking.  Using the approaches that I’ve outlined here requires less, not more, brain twisting to be creative.

© 2020 Norman Jacknis, All Rights Reserved

Digging Deeper Into Why There Is A Problem

Almost every pitch deck for a startup (or even a new corporate-funded initiative) starts with a customer problem. In some form or other, the entrepreneur/intrapreneur says: “Here is a customer problem. The customer’s problem is an opportunity for us because we know how to solve that problem.” And then they go on to ask for the money they need to bring their solution to life.

Having been on the receiving end of these pitches many times, I have often thought that the presenter too quickly jumped on the first problem they saw and it was not the real problem the potential customer had. So if they tried to fix the superficial problem, the entrepreneur/intrapreneur would not get the market traction they hoped for – and it wouldn’t be worth it for us to invest in an idea with no traction.

That’s why in my last post I reviewed the key points in Dan Heath’s book “Upstream: The Quest To Solve Problems Before They Happen”.  In a nutshell, his message is that you have to go upstream beyond the first problem (downstream) you see and find the root cause of that problem.

An example of thinking about a root cause can be found in the 500-year-old poem that is supposed to have been about the English King Richard III’s loss in 1485 at the Battle of Bosworth Field to Henry Tudor, who then became king:

For want of a nail the shoe was lost.
For want of a shoe the horse was lost.
For want of a horse the rider was lost.
For want of a rider the message was lost.
For want of a message the battle was lost.
For want of a battle the kingdom was lost.
And all for the want of a horseshoe nail.

It isn’t always easy to figure out where upstream the problem is.  In post-mortems on fatal catastrophes, root cause analysis often starts with the Five Whys technique.

But you do not need a catastrophic failure to motivate you to use this method.  Anytime you want to understand better the problems that customers or constituents are facing, you can use the method.

It is quite easy to explain, although much harder for most people to do.  Here is a simple example.

Five Whys is especially useful in thinking about any new product or service you hope to bring into the world.  If you identify the root cause of the problem, you’ll be able to come up with the right solution.  If you identify a solution for the superficial complaint a customer has, you may well end up doing the right thing about the wrong thing.

A famous quote attributed to Henry Ford identifies how you can go astray: “If I had asked people what they wanted, they would have said faster horses.”  There were several root causes of the problem that annoyed Ford’s customers, none of which could have been fixed by getting horses to go faster.

As you can see from the 5 Whys picture of a restaurant’s problem, people often think about causes in a linear fashion.  Event A causes Event B, which causes Event C, etc.  So all you need to do is go back from where you started, say Event C.  This is sometimes called Event-Oriented thinking.

But life is more complicated than that.  In his book, eventually Dan Heath introduces the necessity of Systems Thinking, since upstream you may well find not a linear series of causes, but a set of interrelated factors.   This picture nicely summarizes the difference.

You may recognize the feeling of being caught in a loop, being in a “Catch-22” situation where you go in circles.  Since Catch-22 was originally about absurdity in wars and not an everyday experience, perhaps this Dilbert cartoon provides a better simple example.

Properly assessing the forces and their mutual reinforcement – in other words, doing systems thinking – is even harder than struggling with the 5 Whys of a simple linear chain of causes.  But it is necessary to really understand the world you are operating in.

Again, especially for those devising new products or services, it is that understanding which will help you avoid significant, strategic business errors.

© 2020 Norman Jacknis, All Rights Reserved

Are You Looking At The Wrong Part Of The Problem?

In business, we are frequently told that to build a successful company we have to find an answer to the customer’s problem. In government, the equivalent guidance to public officials is to solve the problems faced by constituents. This is good guidance, as far as it goes, except that we need to know what the problem really is before we can solve it.

Before those of us who are results-oriented problem solvers jump into action, we need to make sure that we are looking at the right part of the problem. And that’s what Dan Heath’s new book, “Upstream: The Quest To Solve Problems Before They Happen” is all about.

Heath, along with his brother Chip, has brought us such useful books as “Made To Stick: Why Some Ideas Survive and Others Die” and “Switch: How to Change Things When Change Is Hard”.

As usual for a Heath book, it is well written and down to earth, but contains important concepts and research underneath the accessible writing.

He starts with a horrendous, if memorable, story about kids:

You and a friend are having a picnic by the side of a river. Suddenly you hear a shout from the direction of the water — a child is drowning. Without thinking, you both dive in, grab the child, and swim to shore. Before you can recover, you hear another child cry for help. You and your friend jump back in the river to rescue her as well. Then another struggling child drifts into sight…and another…and another. The two of you can barely keep up. Suddenly, you see your friend wading out of the water, seeming to leave you alone. “Where are you going?” you demand. Your friend answers, “I’m going upstream to tackle the guy who’s throwing all these kids in the water.”


Going upstream is necessary to solve the problem at its origin — hence the name of the book. The examples in the book range from important public, governmental problems to the problems of mid-sized businesses. While the most dramatic examples are about saving lives, the book is also useful for the less dramatic situations in business.

Heath’s theme is strongly, but politely, stated:

“So often we find ourselves reacting to problems, putting out fires, dealing with emergencies. We should shift our attention to preventing them.”

This reminds me of a less delicate reaction to this advice: “When you’re up to your waist in alligators, it’s hard to find time to drain the swamp.” And I often told my staff that unless you took some time to start draining the swamp, you are always going to be up to your waist in alligators.

He elaborates and then asks a big question:

We put out fires. We deal with emergencies. We stay downstream, handling one problem after another, but we never make our way upstream to fix the systems that caused the problems. Firefighters extinguish flames in burning buildings, doctors treat patients with chronic illnesses, and call-center reps address customer complaints. But many fires, chronic illnesses, and customer complaints are preventable. So why do our efforts skew so heavily toward reaction rather than prevention?

His answer is that, in part, organizations have been designed to react — what I called some time ago the “inbox-outbox” view of a job. Get a problem, solve it, and then move to the next problem in the inbox.

Heath identifies three causes that lead people to focus downstream, not upstream where the real problem is.

  • Problem Blindness — “I don’t see the problem.”
  • A Lack of Ownership — “The problem isn’t mine to fix.”
  • Tunneling — “I can’t deal with the problem right now.”

In turn, these three primary causes lead to and are reinforced by a fatalistic attitude that bad things will happen and there is nothing you can do about that.

Ironically, success in fixing a problem downstream is often a mark of heroic achievement. Perhaps for that reason, people will jump in to own the emergency downstream, but there are fewer owners of the problem upstream.

…reactive efforts succeed when problems happen and they’re fixed. Preventive efforts succeed when nothing happens. Those who prevent problems get less recognition than those who “save the day” when the problem explodes in everyone’s faces.

Consider the all too common current retrospective on the Y2K problem. Since the problem didn’t turn out to be the disaster it could have been at the turn of the year 2000, some people have decided it wasn’t real after all. It was, but the issue was dealt with upstream by massive correction and replacement of out-of-date software.

Heath realizes that it is not simple for a leader with an upstream orientation to solve the problem there, rather than wait for the disaster downstream.

He asks leaders to first think about seven questions, which he explores through many cases:

  • How will you get early warning of the problem?
  • How will you unite the right people to assess and solve the problem?
  • Where can you find a point of leverage?
  • Who will pay for what does not happen?
  • How will you change the system?
  • How will you know you’re succeeding?
  • How will you avoid doing harm?

Some of these questions, and an understanding of what the upstream problem really is, can start to be answered by the intelligent use of analytics. That too only complicates the issue for leaders, since an instinctive heroic reaction is much sexier than contemplating machine learning models, and sexy usually beats out wisdom 🙂

Eventually Heath makes the argument that not only do we often focus on the wrong end of the problem, but that we think about the problem too simplistically. At that point in his argument, he introduces the necessity of systems thinking because, especially upstream, you may find a set of interrelated factors and not a simple one-way stream.

[To be continued in the next post.]

© 2020 Norman Jacknis, All Rights Reserved

Technology and Trust

A couple of weeks ago, along with the Intelligent Community Forum (ICF) co-founder, Robert Bell, I had the opportunity to be in a two-day discussion with the leaders of Tallinn, Estonia — via Zoom, of course. As part of ICF’s annual selection process for the most intelligent community of the year, the focus was on how and why they became an intelligent community.

They are doing many interesting things with technology both for e-government as well as more generally for the quality of life of their residents. One of their accomplishments, in particular, has laid the foundation for a few others — the strong digital identities (and associated digital signatures) that the Estonian government provides to their citizens. Among other things, this enables paperless city government transactions and interactions, online elections, COVID contact warnings along with protection/tracking of the use of personal data.

Most of the rest of the world, including the US, does not have strong, government-issued digital identities. The substitutes for that don’t come close — showing a driver’s license at a store in the US or using some third party logon.

Digital identities have also enabled an E-Residency program for non-Estonians, now used by more than 70,000 people around the world.

As they describe it, in this “new digital nation … E-Residency enables digital entrepreneurs to start and manage an EU-based company online … [with] a government-issued digital identity and status that provides access to Estonia’s transparent digital business environment.”

This has also encouraged local economic growth because, as they say, “E-Residency allows digital entrepreneurs to manage business from anywhere, entirely online … to choose from a variety of trusted service providers that offer easy solutions for remote business administration.” The Tallinn city leaders also attribute the strength of a local innovation and startup ecosystem to this gathering of talent from around the world.

All this would be a great story – unusual in practice, although not unheard of in discussions among technologists, including this one.  Yet as impressive as that is, it was not what stood out most strongly in the discussion.  What stood out was Tallinn’s unconventional perspective on the important issue of trust.

Trust among people is a well-known foundation for society and government in general. It is also essential for those who wish to lead change, especially the kind of changes that result from the innovations we are creating in this century.

I often hear various solutions to the problem of establishing trust through the use of better technology — in other words, the belief that technology can build trust.

In Tallinn’s successful experience with technology, cause and effect run in the opposite direction.  In Tallinn, successful technology is built on a trust among people that already existed and is continually maintained, regardless of technology.

While well-thought-out technology can also enhance trust to an extent, in Tallinn trust comes first.

This is an important lesson to keep in mind for technologists who are going about changing the world and for government leaders who look on technology as some kind of magic wand.

More than once in our discussions, Tallinn’s leaders restated an old idea that preceded the birth of computers: few things are harder to earn and easier to lose than trust.

© 2020 Norman Jacknis, All Rights Reserved

Bitcoin & The New Freedom Of Monetary Policy

Every developing technology has the potential for unintended consequences.  Blockchain technology is an example.  Although there are many possible uses of blockchain as a generally trusted and useful distributed approach to storing data, its most visible application has been virtual or crypto-currencies, such as Bitcoin, Ethereum and Litecoin. These once-obscure crypto-currencies are on a collision course with another trend that in its own way is based on technology — mostly digital government-issued money.


In particular, another once-obscure idea about government money is also moving more into the mainstream — modern monetary theory (MMT), which I mentioned a few weeks ago in my reference to Stephanie Kelton’s new book, “The Deficit Myth”. In doing a bit of follow up on the subject, I came across many articles that were critical of MMT. Some were from mainstream economists. Many more were from advocates of crypto-currencies, especially Bitcoiners.

Although I doubt that Professor Kelton would agree, many Bitcoiners feel that governments have been using MMT since the 1970s — merely printing money. They forget about the tax and policy stances that Kelton advocates.

Moreover, there is a significant difference in the attitude of public leaders when they think they are printing money versus borrowing it from large, powerful financial interests. James Carville, chief political strategist and guru for President Clinton famously said, “I used to think that if there was reincarnation, I wanted to come back as the president or the pope or as a .400 baseball hitter. But now I would like to come back as the bond market. You can intimidate everybody.”

For Bitcoiners, the battle is drawn and they do not like MMT. Here is just a sample of the headlines from the last year or so:

It is worth noting that MMT raises very challenging issues of governance. Who decides how much currency to issue? Who decides when there is too much currency? Who decides what government-issued money is spent on and to whom it goes? This is especially relevant in the US, where the central bank, the Federal Reserve, is at least in theory independent from elected leaders.

However, it also gives the government what may be a necessary tool to keep the economy moving during recessions, especially major downturns. Would a future dominated by cryptocurrencies, like Bitcoin, essentially tie the hands of the government in the face of an economic crisis, just as the gold standard did during the Panic of 1893 and the Great Depression (until President Roosevelt suspended the convertibility of dollars into gold)?

This picture shows MMT as a faucet controlling the flow of money as the needs of the economy change. If this were a picture of Bitcoin’s role, the faucet would be almost frozen, dripping a relatively fixed amount that is dependent upon Bitcoin mining.
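To make the frozen-faucet comparison concrete, Bitcoin’s issuance really is fixed by the protocol: the reward for mining a block halves every 210,000 blocks, so the total supply approaches roughly 21 million BTC no matter what the economy needs. Here is a simplified sketch of that schedule (real Bitcoin nodes count integer satoshis and round down, which this sketch glosses over):

```python
# Simplified sketch of Bitcoin's fixed issuance schedule: the block
# subsidy starts at 50 BTC and halves every 210,000 blocks, so total
# supply asymptotically approaches about 21 million BTC.

HALVING_INTERVAL = 210_000
INITIAL_SUBSIDY = 50.0

def block_subsidy(height: int) -> float:
    """New BTC minted at a given block height."""
    halvings = height // HALVING_INTERVAL
    if halvings >= 64:  # subsidy is effectively zero after enough halvings
        return 0.0
    return INITIAL_SUBSIDY / (2 ** halvings)

def total_supply(height: int) -> float:
    """Approximate cumulative BTC issued up to (not including) a height."""
    supply, h = 0.0, 0
    while h < height:
        # advance in whole stretches that share the same subsidy
        step = min(HALVING_INTERVAL - h % HALVING_INTERVAL, height - h)
        supply += step * block_subsidy(h)
        h += step
    return supply

print(block_subsidy(0))        # 50.0
print(block_subsidy(630_000))  # 6.25 (the subsidy after the 2020 halving)
```

Whatever a government faucet might do in a recession, this one drips at a rate set years in advance.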

Less often discussed is that cryptocurrencies, as a practical matter, also end up needing some governance. I am not going to get into the weeds on this, but you can start with “In Defense of Szabo’s Law, For a (Mostly) Non-Legal Crypto System”. The implication is that cryptocurrencies need some kind of rules and laws enforced by some people. Sounds like at least a little bit of government to me.

Putting that aside, if Bitcoin and/or other cryptocurrencies succeed in getting widespread adoption, then it would seem that they would limit the ability of governments to encourage or discourage economic growth through the issuance of money.

Of course, some officials do not seem to worry too much. This attitude is summed up in a European Parliament report, published in 2018.

Decentralised ledger technology has enabled cryptocurrencies to become a new form of money that is privately-issued, digital and that permits peer-to-peer transactions. However, the current volume of transactions in such cryptocurrencies is still too small to make them serious contenders to replace official currencies. 

Underlying this are two factors. First, cryptocurrencies do not perform the role of money well, because their value is very volatile and they are thus not very good stores of value. Second, cryptocurrencies are managed in ways that are very primitive compared to what modern currencies require.

These shortcomings might be corrected in the future to increase the popularity and reach of cryptocurrencies. However, those that manage currencies, in other words monetary policymakers, cannot be outside any societal system of checks and balances.

For cryptocurrencies to replace official money, they would have to conform to the institutional set up that monitors and evaluates those who have the power to manage money.

They do not seem to be too worried, do they? However, cryptocurrency might eventually derail the newfound freedom that government economic policy makers have realized they have through MMT.

As we have seen in the past, new technologies can suddenly grow very fast and blindside public officials. As Roy Amara, past president of The Institute for the Future, said, “We tend to overestimate the effect of a technology in the short run and underestimate the effect in the long run”.

© 2020 Norman Jacknis, All Rights Reserved

The Second Wave Of Capital

I have been doing research about the future impact of artificial intelligence on the economy and the rest of our lives. With that in mind, I have been reading a variety of books by economists, technologists, and others. That is why I recently read “Capital and Ideology” by Thomas Piketty, the well-known French economist and author of the best-selling (if not well-read) “Capital in the Twenty-First Century”. It contains a multi-national history of inequality, why it happened and why it has continued, mostly uninterrupted.

At more than 1100 pages, it is a tour de force of economics, history, politics and sociology. In considerable detail, for every proposition, he provides reasonable data analyses, which is why the book is so long. While there is a lot of additional detail in the book, many of the themes are not new, in part because of Piketty’s previous work.  As with his last book, much of the commentary on the new book is about income and wealth inequality.  This is obviously an important problem, although not one that I will discuss directly here.

Instead, although much of the focus of the book is on capital in the traditional sense of money and ownership of things, it was his two main observations about education – what economists call human capital – that stood out for me. The impact of a second wave and a second kind of capital is two-fold.

  1. Education And The US Economy

From the mid-nineteenth century until about a hundred years later, the American population had twice the educational level of people in Europe. And this was exactly the same period that the American economy surpassed the economies of the leading European countries. During the last several decades, the American population has fallen behind in education and this is the same time that their incomes have stagnated.  It is obviously difficult to tease out the effect of one factor like education, but clearly there is a big hint in these trends.

As Piketty writes in Chapter 11:

The key point here is that America’s educational lead would continue through much of the twentieth century. In 1900–1910, when Europeans were just reaching the point of universal primary schooling, the United States was already well on the way to generalized secondary education. In fact, rates of secondary schooling, defined as the percentage of children ages 12–17 (boys and girls) attending secondary schools, reached 30 percent in 1920, 40–50 percent in the 1930s, and nearly 80 percent in the late 1950s and early 1960s. In other words, by the end of World War II, the United States had come close to universal secondary education.

At the same time, the secondary schooling rate was just 20–30 percent in the United Kingdom and France and 40 percent in Germany. In all three countries, it is not until the 1980s that one finds secondary schooling rates of 80 percent, which the United States had achieved in the early 1960s. In Japan, by contrast, the catch-up was more rapid: the secondary schooling rate attained 60 percent in the 1950s and climbed above 80 percent in the late 1960s and early 1970s.

In the second Industrial Revolution it became essential for growing numbers of workers to be able to read and write and participate in production processes that required basic scientific knowledge, the ability to understand technical manuals, and so on.

That is how, in the period 1880–1960—first the United States and then Germany and Japan, newcomers to the international scene—gradually took the lead over the United Kingdom and France in the new industrial sectors. In the late nineteenth and early twentieth centuries, the United Kingdom and France were too confident of their lead and their superior power to take the full measure of the new educational challenge.

How did the United States, which pioneered universal access to primary and secondary education and which, until the turn of the twentieth century, was significantly more egalitarian than Europe in terms of income and wealth distribution, become the most inegalitarian country in the developed world after 1980—to the point where the very foundations of its previous success are now in danger? We will discover that the country’s educational trajectory—most notably the fact that its entry into the era of higher education was accompanied by a particularly extreme form of educational stratification—played a central role in this change.

In any case, as recently as the 1950s inequality in the United States was close to or below what one found in a country like France, while its productivity (and therefore standard of living) was twice as high. By contrast, in the 2010s, the United States has become much more inegalitarian while its lead in productivity has totally disappeared.

  2. The Political Competition Between Two Elites

By now, most Americans who follow politics understand that the Democratic Party has become the favorite of the educated elite, in addition to the votes from minority groups. This coalition completely reverses what had been true of educated voters in most of the last century, who were reliable Republican voters. In the process, the Democratic Party has lost much of its working-class base.

The Republicans have been the party of the economic elite, although since the 1970s some of the working-class have joined in, especially those reacting to increased immigration and civil rights movements.

What Piketty points out is that, in this transition, working-class and lower income people have decreased their political participation, especially voting. He thinks that is because these voters felt that the Democratic Party has been taken over by the educational elite and no longer speaks for them.

What many Americans may not have realized is that this same phenomenon has happened in other economically advanced democracies, such as the UK and France. Over the longer run, Piketty wonders whether such an electoral competition between parties both dominated by elites can be sustained – or whether the voiceless will seek violence or other undemocratic outlets for their political frustrations.

In Chapter 14, he notes that, at the same time that the USA has lost the edge arising from a better-educated population, it and the other advanced economies that have now matched or surpassed the American educational level have elevated education to a position of political power.

We come now to what is surely the most striking evolution in the long run; namely, the transformation of the party of workers into the party of the educated.

Before turning to explanations, it is important to emphasize that the reversal of the educational cleavage is a very general phenomenon. What is more, it is a complete reversal, visible at all levels of the educational hierarchy. We find exactly the same profile—the higher the level of education, the less likely the left-wing vote—in all elections in this period, in survey after survey, without exception, and regardless of the ambient political climate. Specifically, the 1956 profile is repeated in 1958, 1962, 1965, and 1967.

Not until the 1970s and 1980s does the shape of the profile begin to flatten and then gradually reverse. The new norm emerges with greater and greater clarity as we move into the 2000s and 2010s. With the end of Soviet communism and bipolar confrontations over private property, the expansion of educational opportunity, and the rise of the “Brahmin left,” the political-ideological landscape was totally transformed.

Within a few years the platforms of left-wing parties that had advocated nationalization (especially in the United Kingdom and France), much to the dismay of the self-employed, had disappeared without being replaced by any clear alternative.

A dual-elite system emerged, with on one side, a “Brahmin left,” which attracted the votes of the highly educated, and on the other side, a “merchant right,” which continued to win more support from both highly paid and wealthier voters.

This clearly provides some context for what we have been seeing in recent elections.  And although he is not the first to highlight this trend, the evidence that he marshals is impressive.

Considering how much there is in the book, it is not likely anyone, including me, would agree with all of the analysis. In addition to the analysis, Piketty goes on to propose various changes in taxation and laws, which I will discuss in the context of other writers in a later blog. For now, I would only add that other economists have come to some of the same suggestions as Piketty, although they arrived there by a very different journey.

For example, Daniel Susskind in “A World Without Work” is concerned that a large number of people will not be able to make a living through paid work because of artificial intelligence. The few who still get paid and those who own the robots and AI systems will become even richer while almost everyone else becomes poorer. This blends with Piketty’s views and they end up in the same place – a basic citizen’s income and even a basic capital allotment to each citizen, taxation on wealth, estate taxes, and the like.

We will have much to explore about these and other policy issues arising from the byproducts of our technology revolution in this century.

© 2020 Norman Jacknis, All Rights Reserved

A Budget That Copes With Reality

Five years ago, I wrote about the possibility of dynamic budgeting.  I was reminded of this again recently after reading Stephanie Kelton’s eye-opening new book, “The Deficit Myth”.

Her argument is that, since the U.S. dropped the gold standard and fixed exchange rates, it can create as much money as it wants.  The limit is not an illusory national debt number, but inflation.  And in an economy with less than full employment, inflation is not now an issue.  Her explanation of the capacity of the Federal government to spend leads to her suggestions for a more flexible approach to dealing with major economic and social issues.

Although Dr. Kelton is a former staff director for the Democrats on the Senate Budget Committee, she doesn’t devote many words to the tools used in budgeting.  However, the argument she makes reminds me again that the traditional budget itself has to change, especially by shifting to a dynamic budget.

While states and localities are not in the same position as the Federal government, they also face unpredictable conditions and could benefit from a more flexible, dynamic budget.  Of course, in the face of COVID and economic retraction the necessity of re-allocating funds has become more obvious.

In an earlier blog, I wrote about a simple tax app that is now feasible and also eliminates the bumps in incentives that are caused by our current, old-fashioned tax bracket scheme.   This was not using some untested, cutting-edge technology.  Instead, the solution could use phones, tablets and laptops doing simple calculations that these devices have done for decades.
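As a purely hypothetical illustration of the idea (the rates and the scale constant below are invented for this sketch, not taken from the earlier post), a tax schedule can be a single smooth formula rather than a set of brackets, so the rate rises continuously with income and there are no bumps in incentives:

```python
# Hypothetical sketch of a "smooth" tax schedule: the effective rate
# climbs continuously from MIN_RATE toward MAX_RATE as income rises.
# All three constants are illustrative, not actual policy numbers.
import math

MIN_RATE, MAX_RATE, SCALE = 0.10, 0.40, 200_000.0

def effective_rate(income: float) -> float:
    """Effective tax rate that rises smoothly with income."""
    return MIN_RATE + (MAX_RATE - MIN_RATE) * (1.0 - math.exp(-income / SCALE))

def tax_owed(income: float) -> float:
    return income * effective_rate(income)

# Because the rate changes continuously, earning one more dollar never
# costs more than a dollar in extra tax -- no bracket cliffs.
```

Any phone or laptop can evaluate a formula like this instantly, which is the point of the earlier post: the technology required is trivial.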

Similarly, what is now well-established technology could be used to overcome the problems with traditional fixed budgeting.  (By the way, the same applies to the budgets that corporations devise.)

So, what are the problems that everyone knows exist with budgets?

  1. They’re wrong the day they are approved, since they are trying to predict precisely a future that cannot be known ahead of time. This error is made worse by the early deadlines in the typical budget process.  If you run a department, you are likely to be asked by the budget office to prepare estimates for what you’ll need in a period that extends as far as 18 or even 24 months into the future.
  2. It’s not clear how the estimates are derived. Typically, there are no underlying rules or models, just the addition of personnel and other basic costs that are adjusted from the last year.  This is despite the fact that some things are fairly well known.  For example, it is fairly straightforward to estimate the cost of paying unemployment to an average individual.  What is harder is to figure out how many unemployed people there will be – and, of course, you need to know the total number of unemployed and the average cost in order to compute the total amount of money needed.
  3. Given these problems, in practice during any given budget year, all kinds of exceptions and deviations occur in the face of reality. But the rest of the budget is not readjusted, although the budget staff will often hold back money that was approved as it takes from “Peter to pay Paul”.  The process often seems and is very arbitrary.

Operating in the real world, of course, requires continual adjustments.  Such adjustments can best be accommodated if the traditional fixed budget was replaced by a dynamic budget at the start of the budget process.

One way of doing this is familiar to almost every reader of this blog – the spreadsheet.  The cells in spreadsheets don’t always hold hard, fixed numbers the way fixed budgets do.  Instead, many of those cells contain formulas.

And Congress could specify not so much the individual amounts for each agency or program as their relative priorities under different scenarios.  Thus, in a recession there would be a need for more unemployment insurance funding, but that would recede in the face of other priorities if the economy is booming.

To go back to the unemployment example, the amount in the budget would change as we get closer to the month being estimated and the estimates of the number of people who will be unemployed become more accurate.

Of course, the reader who knows my background won’t be surprised that I think the formulas in these cells could be derived by the use of some smart analytics and machine learning.  Ultimately, these methods could be enhanced with simulations – after all, what is a budget but an attempt to simulate a future period of financial needs?
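A minimal sketch of what such a dynamic budget “cell” might look like, with a naive moving-average forecast standing in for the smarter analytics and machine learning models the post imagines (all figures here are hypothetical):

```python
# Sketch of a dynamic budget line item: the appropriation for
# unemployment insurance is a formula -- forecasted claimants times
# average benefit -- rather than a fixed number set 18 months ahead.

def forecast_claimants(recent_monthly_claimants: list[int]) -> float:
    """Naive stand-in for a real forecasting model: a 3-month moving average."""
    window = recent_monthly_claimants[-3:]
    return sum(window) / len(window)

def unemployment_budget_line(recent_monthly_claimants: list[int],
                             avg_monthly_benefit: float) -> float:
    """The budget 'cell' is a formula, re-evaluated as new data arrives."""
    return forecast_claimants(recent_monthly_claimants) * avg_monthly_benefit

# As actual claims data comes in each month, the same formula yields an
# updated appropriation -- no arbitrary mid-year raids on other lines.
print(unemployment_budget_line([90_000, 100_000, 110_000], 1_800.0))  # 180000000.0
```

A real implementation would swap the moving average for a proper forecasting or simulation model, but the structure of the budget cell stays the same: a formula, not a frozen number.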

More on that in another post sometime in the future.

© 2020 Norman Jacknis, All Rights Reserved

Words Matter In Building Intelligent Communities

The Intelligent Community Forum (ICF) is an international group of city, town and regional leaders as well as scholars and other experts who are focused on quality of life for residents and intelligently responding to the challenges and opportunities provided by a world and an economy that is increasingly based on broadband and technology.

To quote from their website: “The Intelligent Community Forum is a global network of cities and regions with a think tank at its center.  Its mission is to help communities in the digital age find a new path to economic development and community growth – one that creates inclusive prosperity, tackles social challenges, and enriches quality of life.”

Since 1999, ICF has held an annual contest and announced an award to intelligent communities that go through an extensive investigation and comparison to see how well they are achieving these goals.  Of hundreds of applications, some are selected for an initial, more in-depth assessment and become semi-finalists in a group called the Smart21.

Then the Smart21 are culled to a smaller list of the Top7 most intelligent communities in the world each year.  There are rigorous quantitative evaluations conducted by an outside consultancy, field trips, a review by an independent panel of leading experts/academic researchers and a vote by a larger group of experts.

An especially important part of the selection of the Top7 from the Smart21 is an independent panel’s assessment of the projects and initiatives that justify a community’s claim to being intelligent.

It may not always be clear to communities what separates these seven most intelligent communities from the rest.  After all, these descriptions are just words.  We understand that words matter in political campaigns.  But words matter outside of politics in initiatives, big and small, that are part of governing.

Could the words that leaders use be part of what separates successful intelligent initiatives from those of others who are less successful in building intelligent communities?

In an attempt to answer that question, I obtained and analyzed the applications submitted over the last ten years.  Then, using the methods of analytics and machine learning that I teach at Columbia University, I sought to determine if there was a difference in how the leaders of the Top7 described what they were doing in comparison with those who did not make the cut.

Although at a superficial level, the descriptions seem somewhat similar, it turns out that the leaders of more successful intelligent community initiatives did, indeed, describe those initiatives differently from the leaders of less successful initiatives.

The first significant difference was that the descriptions of the Top7 had more to say about their initiatives, since apparently they had more accomplishments to discuss.  Their descriptions had less talk about future plans and more about past successes.

In describing the results of their initiatives so far, they used numbers more often, providing greater evidence of those results.  Even though they were discussing technology-based or otherwise sometimes complex projects, they used more informal, less dense and less bureaucratic language.
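As an illustration of the kind of simple, surface-level text features involved (this is a sketch of the approach, not the actual models used in the analysis, and the two sample descriptions are invented):

```python
# Sketch of simple text features that could separate results-oriented
# descriptions from vague ones: how much numeric evidence a description
# contains, and how dense its language is.

def text_features(description: str) -> dict:
    words = description.split()
    sentences = [s for s in description.replace("!", ".").replace("?", ".").split(".")
                 if s.strip()]
    return {
        "word_count": len(words),
        # share of words containing a digit -- a proxy for quantitative evidence
        "numeral_share": sum(any(c.isdigit() for c in w) for w in words) / max(len(words), 1),
        # longer average sentences suggest denser, more bureaucratic prose
        "avg_sentence_length": len(words) / max(len(sentences), 1),
    }

concrete = "Our fiber network reaches 95 percent of homes. 12,000 residents joined 40 workshops."
vague = "The community intends to pursue comprehensive multi-stakeholder digital transformation initiatives going forward."

# The results-oriented description scores higher on numeric evidence
# and uses shorter sentences than the vague one.
```

The real analysis would feed features like these, along with topic and readability measures, into a classifier to compare the Top7 with the rest.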

Among the topics they emphasized, civic engagement and leadership, as well as the technology infrastructure, stood out most.  Less important, but also a differentiator, the more successful leaders emphasized the smart city, innovation and economic growth benefits.

For those leaders who wish to know what will gain them recognition for real successes in transforming their jurisdictions into intelligent communities, the results would indicate these simple rules:

  • Have and highlight a solid technology infrastructure.
  • True success, however, comes from extensive civic engagement and frequently mentioning that engagement and the role of civic leadership in moving the community forward.
  • Less bureaucratic formality and more stress on results (quantitative measures of outcomes) in their public statements are also associated with greater success in these initiatives.

On the other hand, a laundry list of projects that are not tied to civic engagement and necessary technology, particularly if those projects have no real track record, is not the path to outstanding success – even if they check off the six wide-ranging factors that the ICF expects of intelligent communities.

While words do matter, it is also true that other factors can impact the success or failure of major public initiatives.  However, these too can be added into the models of success or failure, along with the results of the textual analytics.

Overall, the results of this analysis can help public officials understand a little better how they need to think about what they are doing and then properly describe it to their citizens and others outside of their community.  This will help them to be more successful, most importantly for their communities and, if they wish, as well in the ICF awards process.

© 2020 Norman Jacknis, All Rights Reserved

Working From Home Will Change Cities

Just three years ago, the New York Times had this headline: “Why Big Cities Thrive, and Smaller Ones Are Being Left Behind” – trumpeting the victory of big cities over their smaller competitors, not to mention the suburbs and rural areas.  At the top of that heap, of course, was New York City.

Now the headlines are different:

A week ago, the always perceptive Claire Cain Miller added another perspective in an Upshot article that was headlined with the question “Is the Five-Day Office Week Over?”  Her answer, in the sub-title, was that the “pandemic has shown employees and employers alike that there’s value in working from home — at least, some of the time.”

This chart summarizes a part of what she wrote about.  As Miller’s story makes quite clear, it is important to realize that some of what has happened during the COVID pandemic will continue after we have finally overcome it and people are free to resume activities anywhere.  Some of the current refugees from cities will likely move back to the cities and many city residents remained there, of course.  But the point is that many of these old, returning and new urban residents will have different patterns of work and that will require cities to change.

While the focus of this was mostly on remote office work, some observers note that cities still have lots of workers who do not work in offices.  While clearly there are numerous jobs that require the laying of hands on something or someone, there are also blue-collar jobs that do not strictly require a physical presence.

I have seen factories that can be remotely controlled, even before the pandemic.  Now this option is getting even more attention.  One of the technology trade magazines recently (7/3/2020) had a story with this headline – “Remote factories: The next frontier of remote work.”  In another example, GE has been offering technology solutions to enable the employees of utility companies to remotely control facilities – see “Remote Control: Utilities and Manufacturers Turn to Automation Software To Operate From Home During Outbreak”.

So perhaps the first blush of victory of big cities, like the British occupation of New York City during the American Revolution or the invasion of France in World War II, did not indicate how the war would end.  Perhaps the war has not ended because, in an internet age where many people can work from home, home does not have to be in big cities, after all, or if it is in a big city it does not have to be in a gleaming office tower.

These trends and the potential of the internet and technology to disrupt traditional urban patterns, of course, have been clear for more than ten years.  But few mayors and other urban leaders paid attention.  After all, they were in a recent period in which they could just ride the wave of what seemed to be ever-increasing density and growth in cities – especially propelled by young people seeking office jobs in their cities.  This was a wonderful dream, combining the urban heft of the industrial age with cleaner occupations.

Now the possibility of a different world is hitting them in the face.  It is not merely a switch from factory to office employment, but a change from industrial era work patterns too.  Among other things that change means that people do not all have to show up in the same place at the same time.  This change requires city leaders to start thinking about all the various ways that they need to adjust their traditional thinking.

Here are just three of the ways that cities will be impacted by an increasing percentage of work being done at home:

  • Property taxes in most cities usually have higher rates on commercial property than on residential property. Indeed, commercial real estate has been the goose that has laid the golden eggs for those cities which have had flourishing downtowns.  But if the amount of square footage in commercial property decreases, the value of those properties and hence the taxes will go down.  On the other hand, most elected officials are loath to raise taxes on residential real estate, even if those residences are now generating income through commercial activities – a job at home most of the week.
  • Traffic and transit patterns used to be quite predictable. There was rush hour in the morning and afternoon when everyone was trying to get to the same densely packed core.  With fewer people coming to the office every day, that will change.  Even those who meet downtown may not be going there for the 9:00 AM start of the work day, but for a lunch meeting.  Then there is the matter of increasingly frequent and relatively small deliveries to homes, rather than large deliveries to stores in the central business district.  This too turns the traditional patterns upside down.
  • Excitement and enticement have, of course, been traditional advantages of cities. Downtown is where the action is.  Even that is changing.  Although it is still fun to go to Broadway, for example, I suspect that most people had a better view of the actors in the Disney Plus presentation of Hamilton than did those who paid a lot more money to sit many rows back, even in the orchestra section of the theater.  At some point, people will balance this out.  So, cities are going to have to be a lot more creative and find new ways, new magic, to bring people to their core.

Cities have evolved before.  In the 18th century, American cities thrived on the traffic going through their ports.  While the ports still played a role, in later centuries, cities grew dramatically and thrived on their factories and industrial might.  Then they replaced factories with offices.

A transition to an as yet unclear future version of cities can be done and will be done successfully by those city leaders who don’t deny what is happening, but instead respond with a new vision – or at least new experimentation that they can learn from.

© 2020 Norman Jacknis, All Rights Reserved

Is It 1832 Or 2020? Virtual Convention Or Something New?

In these blogs, I’ve often noted how people seem wedded to old ways of thinking, even when those old ways are dressed up in new clothes.

Despite all the technology around us, it’s amazing how little some things have changed.  Too often, today seems like it was 120 years ago when people talked and thought about “horseless carriages” rather than the new thing that was possible – the car with all the possibilities it opened.

So it was with interest that I read this recent story – “Democrats confirm plans for nearly all-virtual convention”:

“Democrats will hold an almost entirely virtual presidential nominating convention Aug. 17-20 in Milwaukee using live broadcasts and online streaming, party officials said Wednesday.”

Party conventions have been around since 1832.  They were changed a little bit when they went on radio and then later on television.  But mostly they have always been filled with lots of people hearing speeches, usually from the podium.

Following in this tradition going back to 1832, the Democratic Party is going to have a convention, but with COVID-19 we can’t have lots of people gathered together.  This one will be “a virtual convention in Milwaukee”, which seems like a contradiction – something that is virtual yet happening in a physical place?  I guess it only means that Joe Biden will be in Milwaukee along with the convention officials who handle procedures.

Indeed, it’s not entirely clear what this convention will look like.  In addition to the main procedures in Milwaukee, the article indicates that “Democrats plan other events in satellite locations around the country to broadcast as part of the convention”.  I assume that will be similar.

The article also notes the event’s longtime producer: “Kirshner knows how it’s done: He has produced every Democratic national convention since 1992.”

Hopefully this will be different from every convention since 1832 – or even 1992!

Instead of the standard speeches on the screen, or other activities that are just video of something that could occur on stage, do something that is more up-to-date.  This would show not only that Biden will be a different kind of President than Trump, but also that he knows how to lead us into the future.

Why not do something that takes advantage of not having to be in a convention hall?

For example, how about a walk (or drive, if necessary) through the speaker’s neighborhood (masks on) explaining what the problems are and what Biden wants to do about those problems?

My suggestions are limited since creative arts are not my specialty, but I do see an opportunity to do something different.  It is a good guess that Hollywood is also eager to help defeat Trump and would offer all kinds of innovative assistance.  Make it an illustration of American collaboration at its best.

This should not be an unusual idea for the Biden organization.  Among his top advisors are Zeppa Kreager, his Chief of Staff, formerly the Director of the Creative Alliance (part of Civic Nation), and Kate Bedingfield, Deputy Campaign Manager and Communications Director, formerly Vice President at Monumental Sports and Entertainment.

Of course, the Trump campaign could take the same approach, but they do not seem interested and Trump obviously adores a large in-person audience.  So there is a real opportunity for Biden to differentiate himself.

Beyond the short-term electoral considerations, this would also make political history by setting a new pattern for political conventions.

© 2020 Norman Jacknis, All Rights Reserved

Trump And Cuomo COVID-19 Press Conferences

Like many other people who have been watching the COVID-19 press conferences held by Trump and Cuomo, I came away with a very different feeling from each.  Beyond the obvious policy and partisan differences, I felt there is something more going on.

Coincidentally, I’ve been doing some research on text analytics/natural language processing on a different topic.  So, I decided to use these same research tools on the transcripts of their press conferences from April 9 through April 16, 2020.  (Thank you to the folks at Rev.com for making available these transcripts.)

One of the best approaches is known by its initials, LIWC (Linguistic Inquiry and Word Count), and was created some time ago by Pennebaker and colleagues to assess especially the psycho-social dimensions of texts.  It’s worth noting that this assessment is based purely on the text – their words – and doesn’t include non-verbal communication, like body language.
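For readers curious about the mechanics: a LIWC-style analysis essentially counts how often a text’s words fall into predefined psychological categories and reports each category as a share of total words. The tiny lexicons below are made-up stand-ins (the real LIWC dictionary is proprietary, much larger, and matches word stems), so this is only a sketch of the idea, not the actual tool:

```python
from collections import Counter
import re

# Toy category lexicons -- illustrative placeholders only; the real LIWC
# dictionary is proprietary and far more extensive.
CATEGORIES = {
    "positive_emotion": {"nice", "great", "good", "happy"},
    "health": {"health", "hospital", "sick", "pain"},
    "certainty": {"always", "never", "definitely", "certainly"},
}

def category_rates(text):
    """Return each category's share of total words, as a percentage."""
    words = re.findall(r"[a-z']+", text.lower())
    total = len(words) or 1          # avoid division by zero on empty text
    counts = Counter()
    for w in words:
        for cat, lexicon in CATEGORIES.items():
            if w in lexicon:
                counts[cat] += 1
    return {cat: 100 * counts[cat] / total for cat in CATEGORIES}

sample = "The hospital staff did a great job. We will definitely beat this."
print(category_rates(sample))
```

Running this over two speakers’ transcripts and comparing the resulting percentages is, in spirit, what the comparison below does, just with far richer categories.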

While there were some unsurprising results to people familiar with both Trump and Cuomo, there are also some interesting nuances in the words they used.

Here are the most significant contrasts:

  • The most dramatic distinction between the two had to do with emotional tone. Trump’s words had almost twice the emotional content of Cuomo’s, including words like “nice” – although perhaps the use of that word should not be taken at face value.
  • Trump also spoke of rewards/benefits and money about 50% more often than Cuomo.
  • Trump emphasized allies and friends about twenty percent more often than Cuomo.
  • Cuomo used words that evoked health, anxiety/pain, home and family two to three times more often than Trump.
  • Cuomo asked more than twice as many questions, although some of these could be sort of rhetorical – like “what do you think?”
  • However, Trump was 50% more tentative in his declarations than Cuomo, whereas Cuomo had greater expressions of certainty than Trump.
  • While both men spoke about the present tense much more than the future, Cuomo’s use of the present was greater than Trump’s. On the other hand, Trump’s use of the future tense and the past tense was greater than Cuomo’s.
  • Trump used “we” a little more often than Cuomo and much more than he used “you”. Cuomo used “you” between two and three times more often than Trump.  Trump’s use of “they” even surpassed his use of “you”.

Distinctions of this kind are never crystal clear, even with sophisticated text analytics and machine learning algorithms.  The ambiguity of human speech is not just a problem for machines, but also for people communicating with each other.

But these comparisons from text analytics do provide some semantic evidence for the comments by non-partisan observers that Cuomo seems more in command.  This may be because the features of his talks would seem to better fit the movie portrayal and the average American’s idea of leadership in a crisis – calm, compassionate, focused on the task at hand.

© 2020 Norman Jacknis, All Rights Reserved