I recently ran across two very interesting articles (Computer Science Education: Where Are the Software Engineers of Tomorrow?, Who Killed the Software Engineer? (Hint: It Happened in College)) discussing how university education today has become inadequate at producing highly qualified software engineers and developers. As a software engineer myself, I think I have to agree with Dr. Dewar’s view detailed in the two articles above.

As with any science, the essence of a computer science education is not to teach you any particular language, but to teach you the theory that all computational tasks abide by.

A typical software engineer today is very different from one even just a couple of decades ago. Back then, writing a computer program usually required intimate knowledge of both the software and the hardware. We had little choice: modern high-level languages (e.g. Java, C#) did not yet exist, and hardware resources were severely limited. We valued algorithms that were lean in both footprint and running time, since a 600K application versus a 60K one often meant the difference between a program that fit into main memory and one that did not, and an inefficient algorithm usually meant an unusable application.
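To put that last point in concrete terms, here is a minimal sketch (in C, a language I chose purely for illustration; it is not taken from either article) contrasting two algorithms for the same task. On the constrained hardware of that era the exponential version quickly became unusable, while the linear one stayed practical.

    /* Two ways to compute the same Fibonacci number. */
    #include <stdio.h>

    /* Naive recursion: roughly 2^n calls. fib_naive(40) already takes
       noticeable time on a modern machine; on the hardware of a couple
       of decades ago it would have been hopeless. */
    static long fib_naive(int n)
    {
        return n < 2 ? n : fib_naive(n - 1) + fib_naive(n - 2);
    }

    /* Iterative version: n additions and constant memory. */
    static long fib_iter(int n)
    {
        long a = 0, b = 1;
        for (int i = 0; i < n; i++) {
            long next = a + b;
            a = b;
            b = next;
        }
        return a;
    }

    int main(void)
    {
        printf("naive:     %ld\n", fib_naive(40));
        printf("iterative: %ld\n", fib_iter(40));
        return 0;
    }

Both functions return the same answer; only the cost differs, and that difference is exactly the kind of fundamental that used to decide whether a program shipped at all.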

Things have arguably changed quite a bit nowadays. Personal computers keep getting more powerful, and for all practical purposes the limits imposed by storage are rapidly disappearing. Few users would notice that an inefficient application takes a second to run when an optimized version could finish a thousand times faster. Likewise, whether an application is 10 MB or 1 MB rarely matters when today's computers comfortably handle working sets of multiple gigabytes.

And with computers as ubiquitous as they are today, almost everyone knows how to operate one and can, to some degree, make it do whatever his or her heart desires. So naturally, with the barrier to entry that much lower, the technical savviness of the average developer is far less than it was ten or twenty years ago.

This situation is made even worse by today's market economy. Most job postings require knowledge of and experience with specific computer languages, and without such credentials (for a fresh college graduate, for example) a candidate is simply not considered, even though we all know that a good developer in one language can typically master any other language in a very short period of time and be just as good at it. Unfortunately, it is human nature to put more emphasis on the surface value we can see.

The society we live in is driven by demand. Given the corporate culture just described, more and more universities, especially the lesser-known ones, have started to emphasize what the market asks for, and the fundamentals (e.g. assembly, C/C++) have become less desirable to teach or to learn, since they hardly ever translate into marketplace value.

And it is only going to get worse. As companies like Microsoft make application development environments easier and easier to use, writing programs seems more and more trivial. Some people even believe that anyone, with some training, can easily master a computer language and thus become a software developer. There might be some truth to that as far as the "surface value" is concerned, but it is a dangerous belief, as there is a lot more to writing good programs than being able to drag and drop.
