A recent article in InfoWorld about the shrinking population of older IT workers hit me especially close to home. As a former programmer — pardon me, software engineer — who left the field in my mid-forties, I was interested in learning why the IT industry tends to shed its older, more experienced workers. According to the article’s author, Lisa Schmeiser, the reasons for this phenomenon are not what you might think.
For example, while age discrimination is alive and well, older workers in general have lower unemployment rates and higher salaries than their younger counterparts. In fact, the more money you make, the less likely you are to be unemployed. (This holds throughout the labor pool, not just the IT sector.) That would suggest the industry should be well-populated with middle-aged techies. But apparently that’s not the case. Schmeiser writes:
A late-1990s study by the National Science Foundation and Census Bureau found that only 19 percent of computer science graduates are still working in programming once they’re in their early 40s. This suggests serious attrition among what should be the dominant labor pool in IT.
The idea that IT shops are filled with gray-bearded Unix geeks is a relic of the past. Today those same organizations are more likely to be populated with twenty-something Linux programmers.
Schmeiser cites some possible reasons the industry is shifting to a younger workforce, including a changing IT culture, the perceived lower price-performance of older workers, the devaluation of technical experience and skills, and the changing nature of the IT job. In fact, all of these are related, and have a lot to do with the shift from an engineering-focused culture to a business-focused culture as IT companies mature. In such an environment tech workers become commodities, with the older ones tending to become obsolete.
The attitude is summed up by a gem of a quote attributed to former Intel CEO Craig Barrett: “The half-life of an engineer, software or hardware, is only a few years.” The implication is that years of experience with one set of technologies — a programming language, a hardware architecture, what have you — do not carry over to the next job, so there is little reason to value that experience.
The result is that the most skilled, most specialized, and most expensive workers tend to be laid off first during a precipitating event, such as a downsizing or a shift to a new set of products and technologies. Absent a layoff, these workers often leave of their own accord when forced to take on new responsibilities or change their work habits. Schmeiser concludes:
Thus, the harsh reality may be that IT jobs — at least as they’re defined now — may be perpetually entry-level.
The entire article is worth a read, especially if you’re a young programmer or engineer wondering what your career has in store for you. Of course, a follow-up piece on how to manage such a career would surely be appreciated. But that’s likely to require a much longer article.