
Harder Core Than Thou

2010-12-31 by qntm

We are still that hardcore.

Forty to fifty years ago, in the 1960s, I.T. professionals worked with systems which, by modern standards, were close to useless. (The term "information technology" didn't exist, but let's call the computer users of that era "I.T. professionals" regardless.) These systems were cutting edge for their era, which is respectable, but they lacked much that we take for granted. For example, output was sent to a printer, not a monitor, and thereby consumed ink, paper and a great deal of time. Any serious computer system occupied an entire room, if not a floor, of your building. It required a team of people to maintain. It was inscrutable and difficult to handle, because any concession to usability carried a real cost in processor cycles, and this was the era when computer processor time was more valuable and more difficult to replace than human programmer time. You were lucky to get one compile per day, so you wrote your code out by hand, punched it onto cards by hand or with some kind of mechanical machine, and avoided dropping them on the floor, because re-ordering them was a monumental challenge. It was the Bronze Age of computing, or whatever metaphor you want to use. Programming was difficult back then: you had to meticulously validate every line of code you wrote, and have it cross-checked, because a failed build cost serious time and ultimately pushed back serious deadlines.

We get it. This is just how computing -- of necessity -- worked in that era. And in order to accommodate these limitations, people wrote good code. Code that would run continuously without hiccup or malfunction for literally forty years -- provided the underlying hardware didn't change significantly. And they wrote that code first time. Because they had to. Code that was close to perfect. The equivalent of walking a tightrope.

Those were great people. No question. They accomplished great feats and created great software, some of which is still operating to this day.

The problem is that people point at the leftover COBOL and processes from that era and lament what software engineering has lost. The statement, whether implicit or explicit, is that, in this era of comforting, forgiving, error-tolerant, problem-catching IDEs, and of programming languages which were -- horror of horrors -- built to be more usable than they are efficient, we have somehow lost the capability to write code which compiles first time. That we have lost the ability to write code which will run continuously for forty years without a hitch. That we are no longer hardcore.

I contest the "forty years" statement. We are observing from a biased perspective. The only forty-year-old code we see running today is the forty-year-old code which was good enough to survive. How much code from that era is dead and erased by now, replaced with something more powerful and maintainable? Who knows? We don't remember that stuff. Because it was erased. What's the percentage, exactly?

Of course a computer program which runs without fault will continue to run without fault indefinitely. It just needs the hardware to support it. If the hardware manufacturers are prepared to continue to provide compatible hardware -- which they are -- then any proven program will be able to run indefinitely. It's not like some memory leak which was negligible after thirty-nine years will suddenly become fatally significant after forty. We work with basically mathematically perfect systems which, by now, have tolerance for hardware faults. If the program is good, and the hardware is good, there will be no problem.

And people write forty-year software all the time. Even right now. We have a tolerant approach to hardware failure these days, of necessity, but that doesn't mean that the software running on that metal is bad. People write software all the time which goes on to run, largely uninterrupted, for five years. Left alone, and properly supported, that software will run for another thirty-five years. No problem. It's not actually that difficult to write good software. The hard part is reserving the forty-year period needed to prove that it's good.

And I contest the "programming greats" statement -- the assertion that they don't exist anymore. There is a well-known piece of prose about Real Programmers: "The Story of Mel". (Mel wrote unmaintainable software. The technology to write unmaintainable software has not been lost. But that's beside the point.) Mel wrote highly efficient code, and was able to do so thanks to his intimate and frightening understanding of the underlying system. People don't write code like that anymore.

That doesn't mean that we can't.

The technology to be incredibly freaking smart has not been lost. People like Mel still exist, except that they fry bigger fish now, and create highly performant software on an entirely different level. The technology to make software which is effectively perfect has not been lost. People who can do it, first time, still exist. But the need to do that has gone. And software which is one hundred percent perfect is not what the customer wants, unless the customer is NASA. And NASA does have software guys. And they do deliver. The reality is that in the majority of situations, what the customer wants is 99% perfect software now, rather than 100% perfect software delivered five years from now at five times the price. The natural state of I.T. right now is one of continuous rapid change, and I.T. consumers are used to this concept. They're used to upgrading when the new fixes are ready. The requirements have changed, and the providers have changed to meet those requirements.

This is not weakness. It doesn't make modern software engineering an easier or wussier task. Don't confuse a relatively low quality of output with low effort or low ability. Look at the commensurate rapid pace and high complexity. It's a trade-off which is made consciously.

If we had to write the Right Stuff, we could. We, programmers as a society, are still That Good. We are still idealists, and the ideal is still attainable. But we simply choose not to, because our clients are pragmatists, and they are the ones for whom we work.

