In The Cloud?

We’ve been hearing about the promise of cloud computing for some
time. There are finally enough companies that have used the cloud to
have experienced the reality of cloud computing and learned some
interesting lessons.

Recently, at its March dinner meeting, the local chapter of the national association of CIOs (SIM) convened a panel of IT executives to discuss the migration to the cloud:

  • Len Peters, the University Chief Information Officer and Associate Vice President at Yale University
  • Larry Biagini recently retired as Vice President and Chief Technology Officer of GE
  • Jeff Pinals, Senior IT Manager for Enterprise Financial & HR Applications at XL Catlin
  • John Hill, COO of Virtustream, with responsibility for Cloud Platform Delivery and Global Data Center Operations.

I
moderated the panel.  Since there’s been so much written on this
subject, I’ll just focus on three revealing, yet not widely reported,
insights.

Security

There has been obvious
concern about security in the cloud, especially when a large amount of
data is held off premises.  Larry Biagini pointed out that those
security issues are already shared with enterprises that do not use the
cloud.

The old security moat around the enterprise is not a
modern defense in a world in which all computers are effectively
connected to each other and both employees and even trusted customers
are executing transactions through their personal devices.  A more
intelligent approach to security prepares the enterprise for its
migration to the cloud – whether it be the public cloud or a cloud that
someone thinks is private or hybrid.


The Cloud Project

Some panelists described the
process they used to select a cloud vendor and migrate to cloud
computing.  The tendency was initially to think that the whole story was
about following the usual steps in any IT project.  But Jeff Pinals
pointed out that migrating to the cloud is more than just another IT
project.  A good example is understanding how cloud computing may pose an
uncomfortable challenge to the organization’s culture – and
planning to address that issue.  Specifically, even in, or perhaps
especially in, companies with the best IT shops, non-IT managers are used
to a high degree of flexibility and accommodation to all sorts of
customizations.  That’s less likely to happen with cloud computing where
a SaaS (software-as-a-service) vendor cannot efficiently run the
operation by being so accommodating.

Return From The Cloud

Although
cloud computing is still a new experience for some companies, already
the question has been raised as to where this leaves an enterprise once
they’ve made the move.  The issue was highlighted by the news just
before this panel spoke that, after using Amazon Web Services since it
started many years ago, Dropbox was leaving the Amazon cloud and
creating its own network and data centers.  See, for example, “Why Dropbox dropped Amazon’s cloud,”
published the day of our meeting.  It’s worth noting that even with the
large resources and technical talent of Dropbox, it took them more than
two years to make this re-migration from the cloud.

The panelists
indicated that there may be several reasons why moving or dropping out
of some other company’s cloud service would be desirable.  Perhaps the
vendor is a competitor or potential competitor.  Perhaps its service wasn’t what
was expected and the decision makers were so burned by the experience
that cloud computing is off the table for now.

In Dropbox’s case,
perhaps the company is just sizable enough that the trade-off between the
value added and the extra cost of using a cloud computing vendor no longer
made financial sense.

Whatever the reasons, after a couple of years, an
enterprise’s IT staff will also have migrated to a different set of
skills when someone else is handling the data center and related
operations.  The panelists noted that loss of data center skills may be
irreversible, at worst, or cost an enormous amount of money to rebuild,
at best.

John Hill ended this discussion by observing that the move to cloud
computing requires a change in orientation about this loss.  Referring
to another utility we take for granted, he asked: Do you generate your
own electricity? Do you know how?

We need to realize that the
benefits of cloud computing have consequences.  Trying to return from
the migration is a bit like coming back out of the real clouds without a
parachute 🙂


© 2016 Norman Jacknis, All Rights Reserved

[http://njacknis.tumblr.com/post/141898219466/in-the-cloud]

Beyond The Craze About Coding

In last week’s post on the Coding Craze, I referred to the continuing
reduction in the need for low level coding – even as what is defined as
low level continues to rise and be more abstract, more separated from
the machines that the software controls.  I even noted the work in
artificial intelligence to create programs that can program.

All
of this is a reflection of the fact that pure coding itself is only a
small part of what makes software successful – something that many
coding courses don’t discuss.

Many years ago in the programming
world, there was a relatively popular methodology named after its two
creators – Yourdon and DeMarco.  While it has mostly been
remembered for its use of data flow diagrams, there was something else that it taught which too many coders don’t realize.

There
is a difference between what is logically or conceptually going on in a
business and the particular way it is implemented.  Yourdon asked
software designers to first figure out what is essential, or as he put it:

“The
essential system model is a model of what the system must do in order
to satisfy the user’s requirements, with as little as possible (and
ideally nothing) said about how the system will be implemented. … this
means that our system model assumes that we have perfect technology
available and that it can be readily obtained at zero cost.  [Note: this is a lot closer to reality today than it was when he wrote about zero cost.]

“Specifically,
this means that when the systems analyst talks with the user about the
requirements of the system, the analyst should avoid describing specific
implementations of processes … he or she should not show the system
functions being carried out by humans or an existing computer system. …
these are arbitrary choices of how the system might be implemented; but
this is a decision that should be delayed until the systems design
activity has begun.”

Thinking this way about how the world operates
and how you want it to operate is the start of software design.  
Software design really has two related meanings – like C++
overloading.  

First, there is the design of the architecture of
the software and overall solution.  Diving into coding without doing
this design is what leads to persistent and embarrassing bugs in
software.  The internal design is also necessary to avoid spaghetti code
that is hard to fix and to improve performance, even in these days of a
supposed abundance of compute resources.


Second, there is the design of the interface that the user sees –
with all the things to worry about that we associate with the “design
thinking” movement.

(One way of planning software is to imagine
the software designer is a playwright, who is responsible for all the
parts of the play aside from one, the user’s part.  I guess this is more
like improv than a play, but you get the idea 🙂 )

So maybe in
addition to the coding class, the wanna-be software developer should go
to improv or drama school.  That’s a more likely path to knowing how to
generate the WOW! reaction from users that makes for software success.


© 2016 Norman Jacknis, All Rights Reserved

[http://njacknis.tumblr.com/post/141545808566/beyond-the-craze-about-coding]

The Coding Craze

A computer coding craze has taken over the country. Everywhere you turn, public officials from President Obama on down in the US and around the world seem to be talking about the need to train folks in computer coding.

Governors
and Mayors are asking their school systems to teach students how to
code instead of learning other subjects. Many people who had little
previous interest in computers or software – except as blissfully
ignorant users – have signed up for, often expensive, courses on
programming.

image

It’s not just in California or Seattle or Austin
(pictured), but back in traditionally less high-tech places on the East
Coast as well. Recently there were stories about coding classes in the
Borough of the Bronx in New York City and as far south as Miami.

There
may be good reasons to take these courses. A bit like the courses that
schools used to teach about how the car combustion engine worked,
learning to program may help people better understand how computers
sometimes operate.

As with any creative activity, at the start,
you can get a sense of accomplishment when witnessing your software
creation come to life — once most of the bugs are eliminated 🙂 You can
even reprise this feeling under special circumstances later on in your
career. But much of the work of coders can, in the long-run, become mind
numbing.

The opportunity to design and create a great new app is
like being invited to paint the Sistine Chapel. But the more frequent
opportunities are like being invited to paint someone’s apartment.

Don’t
get me wrong. As a long time software developer myself, I can say there
are many satisfactions for developers who have both the knack and
passion for software. But people who don’t have those attributes and
just do it like any other job will be frustrated too easily.

And, honestly, even the positive side of life as a developer is not what is primarily driving this coding craze.

Much
of the interest by public officials (like the governor of Arkansas in
the picture) as well as the people enrolled in coding classes is based
on their belief that these courses make possible employment
opportunities that will endure for decades in a world in which
traditional jobs have been automated or shipped overseas. Will they?

image

Surely, some people are going into coding just for an immediate bump in short-term income. Studies
of the relatively new phenomenon of coding bootcamps seem to support
this notion – at least for the 65% of students who graduated and are now
working in a programming job. Even in those cases, the best results were
for students who graduated from the more expensive and selective
programs.

Yet, on balance, count me as a skeptic. I think this
craze is, well, crazy. In the long run, I don’t think coding courses for
the millions will lead to the affluent future and lifelong careers that
many proponents envision.

First, as I’ve alluded to, these are
not jobs that everyone who is learning to code will find satisfying. We
may be too early into this craze to know how many people go into the
field and last for more than a short while, but I’d expect the dropout
rate to be high.

Second, there is the low-level nature of what is being taught – how to write instructions in a currently fashionable language. While most of the coding courses focus on currently popular languages, like Ruby and JavaScript, many of their students do not understand how quickly a language’s popularity can come and go.

Some
languages last longer than others do, of course. Through sheer inertia
and unwillingness to invest, there are still some existing programs
written in old computer languages, like FORTRAN and COBOL. But there
aren’t that many job openings for people coding those old languages.

Wikipedia lists a variety of languages that have been created over the last three and a half decades, approximately one a year:

1980 C++
1983 Ada
1984 Common Lisp
1984 MATLAB
1985 Eiffel
1986 Erlang
1987 Perl
1988 Tcl
1988 Mathematica
1990 Haskell
1991 Python
1991 Visual Basic
1993 Ruby
1993 Lua
1994 CLOS (part of Common Lisp)
1995 Ada 95
1995 Java
1995 Delphi (Object Pascal)
1995 JavaScript
1995 PHP
1996 WebDNA
1997 Rebol
1999 D
2000 ActionScript
2001 C#
2001 Visual Basic .NET
2003 Groovy
2003 Scala
2005 F#
2009 Go
2011 Dart
2014 Swift
2015 Rust
2016 ???

So if all they learn today is the syntax of one language and lack a
deeper education, they may find that one skill falling out of favor.

Indeed,
many of those students aren’t even being taught about the different
kinds of programming languages – even classes of languages vary in
popularity over time.

Instead, they are usually learning imperative languages, especially with a focus on low-level procedures.

It
is also not clear that the popular languages are the best ones for
teaching even basic coding, never mind understanding software more generally.

Even
the idea that any language is good enough to educate students about how
computers work is misleading. Different classes of languages lead to
different ways of thinking about how we can represent the world and instruct
computers.

And finally, the trend in software, in fits and starts,
has been to reduce the need for low-level programming. Originally, it
was a move away from “machine instructions” to higher level languages.
Then there were various tools for rapid application development. Today,
there is the Low-Code or even No-Code movement, especially for Apps.

You’ve
heard of the App Economy, another part of the promised job future?
Putting aside the debate as to whether the app phenomenon has already
peaked, with these low-code tools, fewer coders will be needed to churn
out the same number of apps as in the past.

And then over the horizon, computer scientists have been busy “Pushing the Limits of Self-Programming Artificial Intelligence” as one article states in its title.

Finally,
with this background, pure coding itself, even in past years, was only a
small part of what made software successful. And a successful long term
career in software requires an understanding of what goes beyond
coding.

But this is enough in one post to get many people irked, so I’ll save that for a future post.

© 2016 Norman Jacknis, All Rights Reserved

[http://njacknis.tumblr.com/post/141088761737/the-coding-craze]

New Funding For Libraries

The Metropolitan New York Library Council was invited to take part today in a working conference of the New York State Assembly Standing Committee on Libraries and Information Technology on the digital divide, broadband and especially library funding.  We — Nate Hill, Executive Director, and myself as Board president — took the opportunity to address the large and developing problem of how to fund libraries in this century.

We noted that these subjects are all part of a larger problem.  Libraries are delivering more and more digital content and services to larger numbers of people, especially those who are on the wrong side of the digital divide or who still need help navigating the digital economy.  These increased services require much higher bandwidth than most libraries can now offer, which puts an unfair and arbitrary cap on how well people can be served.


While the need for broadband in libraries and its value to the community is clear, what has been unclear and, at best, sporadic is the financing to make the broadband-based services possible.  When legislators only thought about libraries as just another one of the cultural resources for the state, library funding was limited to a piece of cultural funding.

Now that libraries offer a broader array of services and can offer even more in a digital broadband era, the funding should also be more diversified. 

• To the extent libraries support entrepreneurs and small business, as both a location for innovation and a “corporate reference librarian,” a piece of the economic development budget should support libraries.
• To the extent libraries support students, especially with homework help and after-school resources, a piece of the very large education budget should support libraries.
• To the extent libraries support workforce development and are the most cost-effective, often the only, way that adult learners can keep up their skills to be employable, a piece of the workforce development and public assistance budgets should support libraries.
• To the extent libraries support public health education, a piece of the health budget should support libraries.

There are other examples, but the strategy is clear.  Library funding needs to come from a diverse set of sources, just as a good investor has a balanced portfolio and doesn’t have all the money in one stock.

Of course, in the longer run, public officials will recognize the role of the library as the central non-commercial institution of the knowledge age that we are entering.  As such, perhaps the permanent funding of libraries should be a very light tax on the commerce going through the Internet to support the digital public services that are provided by libraries.

To some degree, the principle of basing support for library broadband on telecommunications revenues has been established with the Federal E-Rate program.  But the amounts are relatively small and the telecommunications base is traditional phone service, which is diminishing, not the Internet which is growing.

Whatever the source of funding may turn out to be, libraries need a consistent source of funding that grows with the demand for their services in this century.


© 2016 Norman Jacknis, All Rights Reserved

[http://njacknis.tumblr.com/post/140390575384/new-funding-for-libraries]