Sunday, June 19, 2005


Programming Jobs Losing Luster in U.S. (AP):

AP - As an eager freshman in the fall of 2001, Andrew Mo's career trajectory seemed preordained: He'd learn C++ and Java languages while earning a computer science degree at Stanford University, then land a Silicon Valley technology job.

The 22-year-old Shanghai native graduated this month with a major in computer science and a minor in economics. But he no longer plans to write code for a living, or even work at a tech company.

Mo begins work in the fall as a management consultant with The Boston Consulting Group, helping to lead projects at multinational companies. Consulting, he says, will insulate him from the offshore outsourcing that's sending thousands of once-desirable computer programming jobs overseas.

More important, Mo believes his consulting gig is more lucrative, rewarding and imaginative than a traditional tech job. He characterized his summer programming internships as "too focused or localized, even meaningless."

(Via Yahoo! News - Technology.)

This article addresses real trends, but the analysis is poor. It cites forecasts that lump together creative design and development work with routine systems administration and maintenance work. This is misleading. The number of routine jobs must decrease as growing communications and computing capacity and advances in software make it possible to automate them. Those productivity advances are what made possible the information technology services we now take for granted, from electronic commerce to online banking. Such services would not exist if computers still had to be lovingly tended by white-coated operators and systems administrators, as they were when I started programming in the late 1970s. To see why, just imagine the huge server farms at Google or Amazon needing that kind of maintenance. They could not exist that way. Deciding what process runs where and for how long is no longer the job of a human operator, but of an operating system. Many other such jobs, from monitoring throughput to detecting hardware failures, which were once done entirely by people, are increasingly automated.
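
To make that concrete, here is a toy sketch of the sort of chore that used to mean a human watching a console: probe a fleet of servers and pull unresponsive ones out of rotation automatically. It is my illustration, not anything from the article; the host names, port, and thresholds are invented.

    # A toy monitor: probe each host on its service port, drop the ones
    # that stop answering, and report, with no human watching a console.
    # (FLEET, PORT, and the thresholds below are made-up examples.)
    import socket
    import time

    FLEET = ["app-01.example.com", "app-02.example.com", "app-03.example.com"]
    PORT = 80          # service port to probe
    TIMEOUT = 2.0      # seconds before a probe counts as failed
    CHECK_EVERY = 30   # seconds between sweeps

    def is_alive(host):
        """True if the host accepts a TCP connection on PORT."""
        try:
            with socket.create_connection((host, PORT), timeout=TIMEOUT):
                return True
        except OSError:
            return False

    def sweep(in_rotation):
        """One pass: demote unresponsive hosts and report fleet health."""
        for host in sorted(in_rotation):
            if not is_alive(host):
                in_rotation.discard(host)
                print(f"ALERT: {host} unresponsive, removed from rotation")
        print(f"{len(in_rotation)}/{len(FLEET)} hosts healthy")

    if __name__ == "__main__":
        healthy = set(FLEET)
        while healthy:
            sweep(healthy)
            time.sleep(CHECK_EVERY)

Production monitoring at a Google or an Amazon is vastly more sophisticated than this, of course, but the principle is the same: the watching is done by software.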

The real question, then, is what information technology work will be needed as the routine jobs are automated away. When I talk about this with people at leading technology-based companies, from search to finance, the conversation always comes back to their difficulty in finding enough creative people with strong analytic and design skills, who can build reliable solutions to complex problems by combining techniques from multiple fields, from database systems to signal processing. Some of the consulting jobs mentioned in the article may have that character, but I fear that many of the "versatilists" cited by Gartner's VP of research will soon wash out, because they and their employers mistake facility with strategic slideware for real ability to solve hard problems.

The article reflects the caricature of computer programmers as unidimensional cube-dwellers that is unfortunately prevalent in the wider culture. It is indirectly right in pointing to the need for people with broader skills than just programming, but that has always been the case. The most successful people in computing are those who solve real problems with computing techniques, and those problems arise most often outside computing, in areas like information access, commerce, finance, biomedicine, aerospace, or entertainment. As several senior people in finance told me, they can train a talented, well-trained computer scientist to build a computational solution for a financial problem much faster than they can train a talented, well-trained financial analyst to program well enough to do the same job. By a well-trained computer scientist I mean here not just someone who can write code reliably and quickly in language X, but someone who knows how to think algorithmically about complex problems and how to turn that understanding into a running solution with whatever software tools are available and most appropriate.

The article's "back to the past" part is just so much myth. I don't know what are Matthew Moran's qualifications as an authority on the IT of the 70s and 80s, which he characterizes thus: "The current situation is getting back to the '70s and '80s, where IT workers were the basement cubicle geeks and they weren't very well off [...] They were making an honest living but weren't anything more than middle-class people just getting by." I don't know what is wrong with a honest living, except in a deluded culture in which everyone thinks they have a quick, painless chance of entertainment fame and fortune. It certainly beats a dishonest living with a Federal ending of the Ebbers, Rigas, or Kozlowski kind. But in any case, those "basement cubicle geeks" of the late 70s and 80s started Microsoft, Cisco, Apple, Sun Microsystems, to name just a few, and many of them, not just the founders, are more than just getting by. Of course there is risk. But there are risks in every profession, and isn't it much more fun to try to build something new than to plod along shuffling PowerPoint decks?

Much as I try to be fair, I cannot avoid the suspicion that an article so heavily based on the opinions of people from consulting and policy research may carry their biases in favor of their own kind of work and against deeper technical work that they may not be qualified to appreciate. In that way the article contributes to the real confusion among college kids and their parents about what their careers will be like. The questions I would ask them are: who is going to build the Intel, the Apple, the Google, the Pixar of 2020? Who is going to tame global warming or design micro-machines that repair diseased cells? Scientists and engineers, or management consultants?


12:41:42 PM