We’re often reminded how many computer users are still a little “iffy” on some basic computing concepts—particularly those who were already out of school when PCs first hit. We make no judgments…it’s all a matter of where one spent one’s misspent youth.
Software is perhaps one of the squishier terms for people to grasp. In a nutshell: “hardware” is a physical product one buys to have a computing system (the stuff that comes in boxes—the tower or laptop, mice, printers, routers, etc.). “Software” is the stuff that makes that hardware actually do something. By definition, it’s tough to visualize or touch (save for the box some of it used to come in, or the diskette or CD it is distributed on).
A rather gross example would be Frankenstein’s monster: hardware is the assembled body parts, but it took a mysterious process to bring it to life. A cadaver is useless without a heart beating and blood flowing.
That’s why an obsolete computer is described as a “boat anchor” once it’s too out-of-date to run current software; the hardware may work for many years yet, but it’s rendered worthless by the pace of software change, which supports only the latest and greatest. (This is part of the perverse economy of computing: it’s too expensive to make software for every edition of every machine, yet its complexity necessitates constant change.)
In the earlier days of personal computing, hardware was more expensive (up to $5,000 for a business PC and peripherals). Clients who claimed to be budget-conscious often surprised us by running out and buying such hardware anyway—then they were shocked that they weren’t done. Without software, the box was useless. But it was a big hurdle for them to spend hundreds or thousands more for a few floppy disks. Software has always been the Rodney Dangerfield of the computer business; its intangible nature has made it hard for most users to appreciate or value.
As it applies to consumers, software comes in two basic categories: operating systems and applications.
Operating systems (like Windows, Linux, or the Mac OS) consist of millions of lines of code that make all the hardware components work together. An operating system is basically the framework of a house, before all the finishes are applied; it looks like a dwelling, but you’re far from being able to throw a party.
Applications are software programs designed to actually do things. With the advent of smartphones, everyone’s more familiar with the concept of an “app” (and the prices are lower than ever). Keep in mind, though, that each application represents several man-years of time to develop and make “idiot proof.” Huge apps like Microsoft Office represent exponentially more labor (yet after this many decades, one wonders why more progress hasn’t been made).
Since day one, users have complained, “Why do they keep changing everything?” There is the crass commercial component—the need for cash flow. But a lot of it has to do with that Moore’s Law you’ve probably heard of. If the core engine of the computer—the processor—changes, software has to change to use it. This sets up a ripple effect through every “higher” layer of software: the programming tools have to change, which means the apps have to be revised or rewritten to work properly with them. This constant churn (sometimes feeling more like “one step forward, two steps back”) has many wondering if we’ve made as much progress as we should have in thirty years of personal computing.
All purchasing decisions stem from the operating system platform, as apps have to be created differently for each one. Your choice of platform represents as much a “religious” preference as anything (i.e., they’re the same, only different). Like many other areas of our economy, we have very limited choices at present as markets have consolidated. We’ll go no further into that now, but bear this in mind: before Microsoft, IBM was a corporate behemoth and de facto standard-setter for decades. By the late 1980s, it already appeared to be a dinosaur in a tar pit, flailing about desperately with diminishing returns.