The decade of the 1980s ushered in many revolutionary changes that affected every person in this country not living in a shack in a remote wilderness area of Montana. Some of these changes included witnessing the newfound fame of the denim-overall (and nothing else) clad rock group Dexy’s Midnight Runners, electing an actor to the office of President of the United States of America, and having a surprisingly large percentage of the world running around screaming, “Where’s the beef?”
While all of these events are important to the evolution of the planet, this decade also witnessed one of the most critical single advancements in the computer industry. Without intending any disrespect to the Pac-Man stand-up video game, the world was never the same after the introduction of the first Personal Computer.
While various computer systems were available to the general public before the “Personal Computer,” many potential customers were turned off by the disclaimer on the box stating “some assembly required.” For just about any other product in the known world, this would mean getting out a Phillips head screwdriver and an adjustable wrench. Assembling a computing system of the time required a soldering gun, a high-precision metal lathe, and a Master’s degree in Electrical Engineering.
IBM changed all of this with the introduction of its Personal Computer. The whole system came already assembled and loaded with the state-of-the-art operating system known as DOS. All a new user had to do was take it out of the box, plug it in, and turn on the power switch. It couldn’t be any easier. Or at least that was the theory.
From the hardware perspective, the Personal Computer helped standardize computer parts. Since IBM didn’t want to be in the business of manufacturing every component that went into its systems, it helped create standards. This allowed different components to be swapped in a single system. For example, if you were running out of space on the hard drive, you could go to the computer store and buy a bigger one. After taking the case off the computer, you would simply swap the old and new drives. After getting the case back on, you would turn on the power, only to see a blank screen come up. The next step would be to put the old drive back in, only to get the same blank screen when it booted up. Finally, you would go to the nearest drinking establishment and order a double shot of whiskey as you came to realize that the last six months of work were trapped inside an uncooperative computer component.
Pretty soon a few computer component manufacturers got this idea in their heads to build their own Personal Computers. Well, IBM had already seen this coming and had taken steps to prevent it from happening. They built the Personal Computer around a single chip, known as the BIOS, that only IBM manufactured. Without this chip, the rest of the hardware couldn’t talk to each other. In effect, you could not build a Personal Computer unless IBM let you.
This situation is quite similar to the safeguards put in place in the movie “Jurassic Park” to keep the dinosaurs from reproducing. And we all know how well that worked out. With the exception of countless bad sequels, the exact same thing happened in the computer industry. One of IBM’s rival companies figured out the exact functionality of the BIOS chip and constructed its own version. This process of reverse engineering opened up the electronic floodgates. Anyone and their dog could now build their own Personal Computer with only a basic understanding of what was happening inside the machine.
While IBM didn’t really seem happy about the entire situation, countless new computer companies were cheerfully popping up overnight. They didn’t all survive the test of time, but companies such as Dell and Compaq expanded and eventually came to dominate the industry. This created fierce competition: the cost of systems was constantly coming down while their speed and capacity were improving. Consumers benefited by having any system they purchased be obsolete by the time they drove home and took it out of the box.
The development of the Personal Computer changed the way the world looked at electronic devices. For better or worse, everyone had to have a computer to get through their daily lives. Even when they made our lives more complicated, doing everything on a computer seemed like a good idea at the time. Well, that’s all for this week; I’m off to go finish my game of computer solitaire.
While there are many, many ways in which computers have been used to make the world a better place to live, the 1970s witnessed the scientifically verifiable best possible use of this emerging electronic technology. No, I’m not talking about the perfection of the Andy Gibb robot duplicate (which ranked fifth overall), but rather the birth of video games.
Up until this point in time, playing games generally involved social interaction and physical activity. In retrospect, it’s hard to believe that people even bothered with this type of behavior. But this was a time in the history of America when people really were not too concerned with their own health or the general state of the planet. As evidence, many people smoked cigarettes, and the Bee Gees’ music was allowed to propagate with little or no government intervention. We didn’t realize back then that the best way to preserve our bodies and minimize physical injury was to sit inside, alone, for large periods of time in front of some type of computer-controlled output device.
The first commercially successful video game system was named Pong. This simulation was an exact electronic replication of the game of tennis. The only minor components of the sport removed included rackets, nets, gravity, wind resistance, the third dimension, and, of course, Arthur Ashe. Also, the ball was square instead of spherical. Despite these limitations, the game of Pong was a tremendous success, which goes to show how a well-run marketing department can make or break the release of a new product. The lead computer programmer for the company described the game as “two sticks that can move up and down bouncing a ball back and forth.” The packaging of the product in stores proclaimed the game of Pong to be “virtual reality fourth dimension alien space tennis with real lasers.”
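For the record, the programmer’s description really is the whole game. Here is a minimal sketch, in modern Python rather than anything Atari actually shipped, of that “two sticks bouncing a square ball” loop; the field size, paddle reach, and function names are made up purely for illustration.

    # A minimal sketch of "two sticks that can move up and down bouncing a
    # ball back and forth," minus rackets, nets, gravity, and Arthur Ashe.
    # Field size, paddle reach, and names are illustrative, not Atari's code.

    WIDTH, HEIGHT = 40, 12      # size of the playing field, in character cells
    PADDLE_REACH = 2            # how far a stick extends from its center point

    def step(ball, velocity, left_y, right_y):
        """Advance the (square) ball one tick; return new state and any score."""
        x, y = ball[0] + velocity[0], ball[1] + velocity[1]
        vx, vy = velocity

        # Bounce off the top and bottom walls.
        if y <= 0 or y >= HEIGHT - 1:
            vy = -vy

        # Bounce off a stick if it is in the way; otherwise the other side
        # scores and the ball is served again from the middle of the field.
        if x <= 0:
            if abs(y - left_y) <= PADDLE_REACH:
                vx = -vx
            else:
                return (WIDTH // 2, HEIGHT // 2), (1, 1), "right scores"
        elif x >= WIDTH - 1:
            if abs(y - right_y) <= PADDLE_REACH:
                vx = -vx
            else:
                return (WIDTH // 2, HEIGHT // 2), (-1, 1), "left scores"

        return (x, y), (vx, vy), None

    # Serve the ball and run a few ticks with both sticks parked mid-court.
    ball, vel = (WIDTH // 2, HEIGHT // 2), (1, 1)
    for _ in range(100):
        ball, vel, scored = step(ball, vel, HEIGHT // 2, HEIGHT // 2)
        if scored:
            print(scored)

The marketing department is, of course, free to describe the above however it sees fit.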
The next major video game system to capture the hearts and minds of the American public was the Atari 2600. Unlike the game of Pong, this setup allowed for different game cartridges to be inserted into the main unit. When people grew tired of their existing game collection, they could just drive out to the nearest retail store and buy a few more.
This system also had the advantage of separating the hardware and software components of the video game system, which meant that any Tom, Dick, or Harry could get together in a garage and start making their own video game titles. When this phenomenon occurs, the results can revolutionize the world. Usually, though, it meant a few very mediocre titles. While several impressive games ran on the Atari 2600, countless forgettable counterparts sat next to them on the shelves of the store. Unfortunately, consumers had a hard time determining which of these games were worth buying, as they all claimed to be some slight variation of “alien space tennis.”
The Atari 2600 era largely ended with the introduction of the Commodore 64. While not exclusively a video game system, it included a keyboard and an optional floppy disk drive. This meant that anyone who owned a Commodore 64 could write their own programs and distribute them on a floppy disk. Potential computer nerds didn’t even need to work from their garage anymore; code could be written from the comfort of their own living rooms without creating a big mess of wires, circuit boards, and duct tape. In addition to rampant unchecked piracy, this system also led to some of the most well-designed video games the world has ever seen. I’ll always lovingly remember my Commodore 64, despite the fact that my mom threw it out while I was away at college.
The video game industry has been continually improving its systems to keep up with the demands of consumers. While these “consumers” do not have a centralized leader or clear command structure, intelligence reports indicate they demand games that are colorful, make interesting noises, and inspire them to remain motionless for indefinite periods of time, even when it is nice enough to go outside and play. The computational resources needed to operate these games are quite impressive. One recent study reported that if all the processing power from all the computers running video games could be harnessed at once, the resulting system would be powerful enough to master the game of chess, sequence the DNA of the entire human race, or locate Jimmy Hoffa. Since that is never going to happen, you might as well go to the store and buy “Ultimate Alien Space Tennis 7.”
After the concepts involved in the ENIAC computer proved to be a success, people started asking a lot of questions about the future of computational devices. “What else can it do?”, “Can it be made smaller than 200 tons?”, and “Does it come in blue?” were just a few of the many, many thoughts people had on the topic.
The 1950s and 1960s were quite exciting times for the development of computers. Successors to the ENIAC system allowed researchers to gain valuable insights into the mathematical and sociological functions of our world. For example, the companies who won large and profitable government contracts to build and maintain computer systems quickly learned to construct their systems with large panels of blinking lights. While a few of the lights corresponded to actual parameters of the machinery, such as “power,” “something is going on inside,” and “an unknown error has occurred at location 57EE:009B,” most of the lights were designed to blink on and off in a way that was aesthetically pleasing to the eye.
This functionality proved to be critical when top-level Defense Department officials or members of Congress stopped by to see the final results of their considerable expenditures. After a tour of the facilities, the gentlemen would light up their pipes, puff out their chests, and confidently spew out random pleasantries like “Good work, men!”, “This is EXACTLY what we need to beat the Commies!”, and “I don’t know about you, Bob, but I think it needs more blue lights.” Eventually the contractors brought in interior decorators during the hardware design phase to coordinate the color schemes of the systems. Some of the individuals who programmed the computers started to develop software that did nothing more than make the lights blink in the most interesting sequence possible.
Eventually blinking light technology reached its limit, and computer designers were forced to explore other avenues. An in-depth investigation revealed that, in addition to changes in light intensity, the human eye responds positively to periodic rotational motion. Armed with this knowledge, computers were enhanced with state-of-the-art tape drives. While possessing few, if any, adhesive properties, these devices were used to store and retrieve information on a long, thin strip of material capable of holding a magnetic charge. The constant back-and-forth motion provided a convincing illusion of productivity. Oftentimes the managers would be giving tours of the computer facility while the rest of the office was busy in the break room building elaborate paper fortresses out of rolls of Scotch tape and reams of used continuous feed paper.
In addition to the blinking lights and reel-to-reel tape devices, each generation of computers was becoming smaller and more powerful than its predecessor. The development of the integrated circuit allowed designers to eliminate bulky vacuum tubes. These types of technological advancements allowed the same amount of computational power to occupy a continually shrinking volume of space. This phenomenon is often referred to as the Carnie Wilson effect.
All of this visual stimulation associated with computing devices led the general public to assume that while computers were useful in some abstract manner, they would eventually become sentient and bent on destroying the human race. While it isn’t mathematically feasible to prove such an event will never happen, many popular films of the era encouraged this concept. One prime example is the movie “2001: A Space Odyssey.”
After successfully sending its crew halfway across the solar system, HAL, the talkative onboard computer system, decides to fling the crew members into outer space one at a time just because he has nothing better to do. In all reality, that is not how computers of the day would have worked. The worst thing that could have happened is that the “fling yourself out the airlock one at a time” light would have lit up. Eventually the crew would have realized this was a computer error and not in the best interest of the mission. If this had occurred before everyone followed the instructions, one of the remaining crew members would have put a small piece of tape over the light and ignored it for the duration of the movie. I believe this would have all been clearly explained if a logistical error during the final editing process hadn’t caused extensive quantities of a completely different film to accidentally replace the intended ending of the movie.
While the 1950s and 1960s were a time of extensive change in the world of computers, the true power of these devices was just beginning to be discovered. Will these machines of our own creation, with their hypnotizing blinking lights and magnetic tape drives, indeed take over the world? The world may never know. Unless, perhaps, you are Bill Gates.