Innovations In Government

The National Association of Counties just concluded its annual mid-winter Legislative Conference in Washington, DC.  I was there in my role as NACo’s first Senior Fellow.

As usual, its Chief Innovation Officer, Dr. Bert Jarreau, created a three-day extravaganza devoted to technology and innovation in local government.

The first day was a CIO Forum, the second day NACo’s Technology Innovation Summit, and the final day a variety of NACo committee meetings on IT, GIS, etc.

County governments – especially the best ones – get too little recognition for their willingness to innovate, so I hope this post will provide some information about what county technologists and officials are discussing.  

One main focus of the meetings was government’s approach to technology and how it can be improved.  

Jen Pahlka, founder and Executive Director of Code For America and former Deputy Chief Technology Officer in the White House, made the keynote presentations at both the CIO Forum on Friday and the Tech Summit on Saturday – and she was a hit in both.

She presented CfA’s seven “Principles for 21st Century Government”.  The very first principle is that user experience comes before anything else.  The use of technology is not, contrary to some internal views, about “solving” some problem that the government staff perceive.  

She pointed out that the traditional lawyer-driven design of government services actually costs more than user-centric design.  (I’ll have more on design in government in a future blog post.)

She referred to the approach taken by the United Kingdom’s Government Digital Service (for more about them, see https://gds.blog.gov.uk/about/).  When she was in the White House, she took this as a model and helped create the US Digital Service.

She also discussed the importance of agile software development.  She suggested that governments break up their big RFPs into several pieces so that smaller, more entrepreneurial and innovative firms can bid.  This perhaps requires a bit more work on the part of government agencies, but they would be rewarded with lower costs and quicker results.

More generally she drew a distinction between the traditional approach that assumes all the answers – all the requirements for a computer system – are known ahead of time and an agile approach that encourages learning during the course of developing the software and changing the way an agency operates.

By way of example, she discussed why the Obamacare website failed.  It used the traditional, waterfall method, not an agile, iterative approach.  It didn’t involve real users testing and providing feedback on the website.  And, despite the common wisdom to the contrary, the development project was too big and over-planned.

It was done in a way that was supposed to reduce risk, but instead was more risky.  So she asked the NACo members to redefine risk, noting that yesterday’s risky approach is perhaps today’s prudent approach.

Cloud computing is helping here.  Oakland County (Michigan) CIO Phil Bertolini has found that cloud computing is reducing government’s past dependence on big capital projects to deploy new technology, thus allowing for more day-to-day agility.

Finally, Jen Pahlka suggested that government systems needed to be more open to integration with other systems.  In a phrase, “share everything possible to share”.  She showed an example in which a government let Yelp use its restaurant inspection data and, in turn, learned about food problems from Yelp users.  (And, of course, sharing includes not just data, but also software and analytics.)

In another illustration of open innovation in the public sector, Montgomery County, MD recently created its Thingstitute as an innovation laboratory where public services can be used as a test bed for the Internet of Things.

Even more examples were discussed in the IT Committee.  Maricopa County, Arizona and Johnson County, Kansas, both now offer shared technology services to cities and nearby smaller counties.  Rita Reynolds, CIO of the Pennsylvania County Commissioners Association, discussed the benefits of adopting the NIEM approach to data exchanges between governments.

The second major focus of these three days was cybersecurity.  

Dr. Alan Shark, Executive Director of PTI, started off by revealing that the latest surveys show security is the top concern for local government CIOs for the first time.  Unfortunately, many don’t have the resources to react to the threat.

Actually, it’s more a reality than merely a threat.  It was noted that, on average, it takes 229 days for organizations to find out they’ve been breached and that close to 100% have been attacked or hacked in some way.  It’s obviously prudent to assume your organization has been hacked too.

Jim Routh, Chief Information Security Officer (CISO) of the insurer Aetna, recommended a more innovative approach to responding to cybersecurity threats.  He said CIOs should ignore traditional advice to try to reduce risk.  Instead, “take risks to manage risk”.  (This was an interesting, if unintentional, echo of Jen Pahlka’s comments about software development.)

Along those lines, he said it is better to buy less mature cybersecurity products, in addition to or even instead of the well-known products.  The reason is that the newer products address new issues better in an ever-changing world and cost less.

There was a lot more, but these highlights provide plenty of evidence that at least the folks at NACo’s meetings are dealing with serious and important issues in a creative way.  

© 2015 Norman Jacknis

[http://njacknis.tumblr.com/post/112058162497/innovations-in-government]

Simplicity In Government?

The idea of simplicity in government is not new. 

Thomas Jefferson was an advocate of “republican simplicity.”  As he wrote in the year before he was elected President:

“I am for a government rigorously frugal and simple…”

Among others in the 18th century, Thomas Paine also was an advocate of simplicity in government.  That was one reason he supported a single-house Congress that would control the national government, rather than the complex system we have. 

Coming closer to our time, the last couple of years have seen a renewal of this idea.  “Simpler: The Future of Government” was published in 2013.  The book’s author, Cass Sunstein, was a longtime professor at the University of Chicago Law School and then ran the White House Office of Information and Regulatory Affairs for President Obama.  In that role, he was a continual advocate for simplicity.

The complaints of the business community have partly driven the desire for simplicity in government regulations.  More broadly, overly complex government operations have also been tied to higher-than-necessary taxes – so they affect everyone’s pocketbook.

It almost seems that no one can argue against simplification. 

But Syracuse University professor David Driesen, for example, argues in a review of Sunstein’s work that “complexity bears no fixed relationship to costs or benefits.”  Moreover, he points out that there is often a trade-off between simplicity and other values; or, looking at it another way, complexity in government is often a result of compromises that are necessary for a law to be enacted.  

He’s also not the first to notice that some who advocate simplicity attribute it only to those policies and actions that they support on other grounds.

So perhaps simplicity of laws and regulations is not so simple, after all.

But simplicity has many forms.  Is there a way of thinking about simplicity in government that bypasses underlying ideological motivations?

I think so, but it has less to do with debates about political philosophy and law, and more to do with the concrete interactions between government and people – the citizen’s experience.

For that, there are examples and inspiration from outside the public sector.  Perhaps one of the best is Apple, especially as explained in the book, “Insanely Simple: The Obsession That Drives Apple’s Success”.  In this book, Ken Segall, one of the company’s former marketing experts, points out the many ways that Apple and Steve Jobs worked to simplify the experience of dealing with Apple’s products and services – despite the ways that this might increase the complexity of the problems facing its designers, engineers and other staff.

Although this approach hasn’t been used much in governments in the US, it is not a completely outlandish idea.  Tim Brown, the CEO of the famous design firm IDEO, proclaimed in his blog that “The UK Government Shows How to Design for Simplicity” – at least with respect to its Internet presence and digital public services.  

The implication of Apple’s obsession with simplicity is that it starts out by subordinating everything it does to the user’s needs.  And isn’t that what a democratic government is supposed to do too?

© 2015 Norman Jacknis

[http://njacknis.tumblr.com/post/111280190609/simplicity-in-government]

The Decentralization Of Health Care

Eric Topol is a physician and editor-in-chief at Medscape.  He was interviewed on the Colbert Report last year.  His new book, published last month, has been reviewed in the major newspapers.  Yet this book, “The Patient Will See You Now: The Future of Medicine is in Your Hands”, hasn’t received the attention it deserves.

The book is about the future of health care – what’s already happening and what could be coming that’s even better.

Topol’s theme is that new technology and practices make it possible to democratize medical care – to move away from the traditional, paternalistic, hierarchical relationship between doctor and patient.

Hence the title, which inverts the medical receptionist’s traditional words that “the doctor will see you now.”

Here’s a sample of some of his key arguments:

“… the world is changing.  Patients are generating their own data on their own devices.  Already any individual can take unlimited blood pressures or blood glucose measurements.”

“We are embarking on a time when each individual will have all their own medical data and the computer power to process it in the context of their own world.  There will be comprehensive medical information about a person that is eminently accessible, analyzable and transferable.”

“Today patients can rapidly diagnose their skin lesion or child’s ear infection without a doctor.  That’s just the beginning.  … your smartphone will become central to labs, physical exams, and even medical imaging; … you can have ICU-like monitoring in the safety, reduced expense, and convenience of your home.”

“The doctor will see you now via your smartphone screen … they will incorporate sharing your data – the full gamut from sensors, images, labs, and genomic sequence, well beyond an electronic medical record.”

The book is very well researched and comprehensively covers all kinds of ways that technology is interacting with and affecting health care.  Dr. Topol provides dozens of examples from all over the field – a laboratory on a chip, smart phones with all kinds of attachments that enable easy measurement of health conditions anywhere, etc.

As a physician, he rightly is concerned about the doctor-patient relationship.  As a sometime patient myself, I of course find this of personal interest as well.

But beyond that obvious reason, why else is the picture he presents so important?

With my perspective on how technology will affect where and how we will live and work, his story is as much about the decentralization of medical care as it is about the democratization.

With this decentralization, Dr. Topol envisions the patient’s home becoming an instant medical lab or even a temporary hospital wing.  This means that you can dramatically improve the quality of your health care even if your home is in the countryside, miles from a major medical center in the center of a metropolis.

And it’s this distance from medical care that frequently worries those who live in the countryside.  So when this transformation of medical care becomes more common, one more traditional disadvantage of rural living will disappear.

© 2015 Norman Jacknis

[http://njacknis.tumblr.com/post/110724183408/the-decentralization-of-health-care]

Big Data, Big Egos?

By now, lots of people have heard about Big Data, but the message often comes across as just another corporate marketing phrase, and one with multiple meanings.  That may be because people also hear from corporate executives who eagerly anticipate big new revenues from the Big Data world.

However, I suspect that most people don’t know what Big Data experts are talking about, what they’re doing, what they believe about the world, and the issues arising from their work.

Although it was originally published in 2013, the book “Big Data: A Revolution That Will Transform How We Live, Work, And Think” by Viktor Mayer-Schönberger and Kenneth Cukier is perhaps the best recent in-depth description of the world of Big Data.

For people like me, with an insatiable curiosity and good analytical skills, having access to lots of data is a treat.  So I’m very sympathetic to the movement.  But like all such movements, the early advocates can get carried away with their enthusiasm.  After all, it makes you feel so powerful – as I recall from some bad sci-fi movies.

Here then is a summary of some key elements of Big Data thinking – and some limits to that thinking. 

Causation and Correlation

When presented with the result of some analysis, we’ve often been reminded that “correlation is not causation”, implying we know less than we think if all we have is a correlation.

For many Big Data gurus, correlation is better than causation – or at least finding correlations is quicker and easier than testing a causal model, so it’s not worth putting the effort into building that model of the world.  They say that causal models may be an outmoded idea or, as Mayer-Schönberger and Cukier say, “God is dead”.  They add that “Knowing what, rather than why, is good enough” – good enough, at least, to try to predict things.

This isn’t the place for a graduate school seminar on the philosophy of science, but there are strong arguments that models are still needed whether we live in a world of big data or not.

All The Data, Not Just Samples

Much of traditional statistics dealt with the issue of how to draw conclusions about the whole world when you could only afford to take a sample.  Big Data experts say that traditional statistics’ focus is a reflection of an outmoded era of limited data. 

Indeed, an example is a 1975 textbook that was titled “Data Reduction: Analysing and Interpreting Statistical Data”.  While Big Data provides lots more opportunity for analysis, it doesn’t overcome all the weaknesses that have been associated with statistical analysis and sampling.  There can still be measurement error.  Big Data advocates say the sheer volume of data reduces the necessity of being careful about measurement error, but can’t there still be systematic error?
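
To make that last point concrete, here is a small toy simulation (my own illustration with made-up numbers, not an example from the book): random noise does average out as the sample grows, but a constant bias in the measuring instrument never does, no matter how much data is collected.

```python
# Toy illustration (hypothetical numbers): more data shrinks random error
# but leaves systematic error untouched.
import random

TRUE_VALUE = 100.0   # the quantity we are actually trying to measure
BIAS = 2.5           # systematic error: every reading runs 2.5 units high
NOISE_SD = 5.0       # random error: standard deviation of a single reading

random.seed(42)

for n in (100, 10_000, 1_000_000):
    readings = (TRUE_VALUE + BIAS + random.gauss(0, NOISE_SD) for _ in range(n))
    estimate = sum(readings) / n
    # The estimate converges on TRUE_VALUE + BIAS (102.5), not TRUE_VALUE,
    # no matter how large n becomes.
    print(f"n={n:>9,}  estimate={estimate:8.3f}  error={estimate - TRUE_VALUE:+.3f}")
```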

Big Data gurus say that they include all the data, not just a sample.  But, in a way, that’s clearly an overstatement.  For example, you can gather all the internal records a company has about the behavior and breakdowns of even millions of devices it is trying to keep track of.  But, in fact, you may not have collected all the relevant data.  It may also be a mistake to assume that what is observed about even all people today will necessarily be the case in the future – since even the biggest data set today isn’t using tomorrow’s data.

More Perfect Predictions

The Big Data proposition is that massive volumes of data allow for almost perfect predictions and fine-grained analysis, and can almost automatically provide new insights.  While these fine-grained predictions may indicate connections between variables/factors that we hadn’t thought of, some of those connections may be spurious.  This is an extension of the issue of correlation versus causation because there is likely an increase in spurious correlations as the size of the data set increases.
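
Here is another toy sketch (again my own illustration, with an arbitrary cutoff, not something from the book) of how spurious correlations pile up: generate variables that are completely unrelated by construction, then count how many pairs nonetheless look “correlated”.  As a data set tracks more variables, the number of pairs to compare grows quadratically, and so does the number of chance correlations.

```python
# Toy illustration: purely random, unrelated variables still produce
# "strong-looking" pairwise correlations, and the count grows with the
# number of variables tracked.
import random
from itertools import combinations

def pearson(x, y):
    """Sample Pearson correlation coefficient of two equal-length lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

random.seed(0)
N_OBS = 50        # observations per variable
THRESHOLD = 0.3   # arbitrary cutoff for a "strong-looking" correlation

for n_vars in (10, 50, 200):
    data = [[random.random() for _ in range(N_OBS)] for _ in range(n_vars)]
    spurious = sum(1 for a, b in combinations(data, 2)
                   if abs(pearson(a, b)) > THRESHOLD)
    pairs = n_vars * (n_vars - 1) // 2
    print(f"{n_vars:>3} variables, {pairs:>6} pairs -> {spurious} chance 'correlations'")
```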

If Netflix recommends movies you don’t like, this isn’t a big problem.  You just ignore them.  In the public sector, when this approach to predicting behavior leads to something like racial profiling, it raises legal issues.

It has actually been hard to find models that achieve even close to perfect predictions – even in the well-known stories about how Farecast predicted the best time to buy air travel tickets or how Google searches predicted flu outbreaks.  For a more general review of these imperfections, read Kaiser Fung’s “Why Websites Still Can’t Predict Exactly What You Want”, published in Harvard Business Review last year. 

Giving It All Away

Much of the Big Data movement depends upon the use of data from millions – billions? – of people who are making it available unknowingly, unintentionally or at least without much consideration.

Slowly, but surely, though, there is a developing public policy issue around who has rights to that data and who owns it.  This past November’s Harvard Business Review – hardly a radical fringe journal – had an article that noted the problems if companies continue to assume that they own the information about consumers’ lives.  In that article, MIT Professor Alex Pentland proposes a “New Deal On Data”. 

So Where Does This Leave Us?

Are we much better off and learning much more with the availability of Big Data, instead of samples of data, and the related ability of inexpensive computers and software to handle this data?  Absolutely, yes!

As some of the big egos of Big Data claim, is Big Data perfect enough that we can set aside our skepticism about its results?  Has Big Data become the omniscient god?  Not quite yet.

© 2015 Norman Jacknis

[http://njacknis.tumblr.com/post/110070952204/big-data-big-egos]