From number crunching to creativity

Steve Hunka
Using computers almost daily for 25 years provides a perspective not only for recalling the joys and frustrations of computing, but also for speculating about why computing is so important to so many individuals, and collectively to a nation.
Early computation using mechanical sorter-counters and punched cards provided increased efficiency but left little room for creativity. Frequently, more time had to be devoted to clearing card jams and replacing torn cards than to solving the problems represented by the data. In the days of unit record equipment, read errors meant warped cards or cards with misaligned holes. Unit record equipment, programmed through patch-boards by connecting input and output points with wiring pins, frequently demanded more manual dexterity than creative thought and offered little more than the simplest arithmetic computations. Twenty-five years ago, getting access to a computer with electronically stored program code was an exciting and awe-inspiring event, because machines were large and had beautiful displays of blinking lights which announced each elementary operation. In 1952 the University of Illinois provided students and researchers access to such a computer through an ORDVAC-class machine called Illiac I.

Illiac I
For the user, Illiac I consisted of two main units, each approximately twelve feet long, eight feet high, and three feet deep. One unit contained the CPU, the other a 25K drum. Together these units occupied a room about 30 feet square; the power supply was in a separate 10'x15' room. An electrostatic memory of 1024 x 40 bits was provided by 40 small CRTs. On each tube was displayed a raster of 1024 dots, providing an electrostatic delay circuit for 1024 bits. Execution speeds were relatively fast, e.g., 90 microseconds for addition and 800 microseconds for division. A flat-screen CRT with a 35mm camera was also available for plotting purposes.
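The figures above are enough to work out Illiac I's total main-memory capacity. A minimal sketch (the conversion to bytes is ours; the machine itself was word-addressed with 40-bit words):

```python
# Illiac I electrostatic memory, per the figures above:
WORDS = 1024        # each CRT displayed a raster of 1024 dots, one bit per word
BITS_PER_WORD = 40  # one CRT per bit position, 40 CRTs in all

total_bits = WORDS * BITS_PER_WORD
print(total_bits)             # 40960 bits
print(total_bits / 8 / 1024)  # 5.0 -- about 5 KB in modern terms
```

Five kilobytes of main memory, in a pair of cabinets twelve feet long.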
Most input and output was by paper tape prepared on teletype equipment. Special equipment for rapid duplication of paper tape was available. Students soon learned the frustrations associated with trying to unravel a "bird's nest" when a large roll of tape was accidentally dropped. Only the most elementary software was available, for basic arithmetic functions and I/O. A machine-language-like program code was used, with two instructions per word.
Operating Illiac I was a model of simplicity. The operator could bootstrap the system by placing a read instruction in the instruction register, using the capacitive effect of touching an external pin connected through a glass panel to each of the 40 bit positions, and then moving one of three switches on a control panel only six inches square. A small speaker interfaced to the sign bit of one word provided an audible signal during computations. Operators soon learned to recognize endless loops even while engrossed in reading the latest novels. Time was precious on Illiac I. Three or four aborted jobs brought a warning note from the director of the center. Operational programs had to be documented and include an equation based on the execution times of each machine operation used and arguments representing data parameters. Today, many personal computers have as much power as Illiac I.
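The required runtime equation can be illustrated with a small sketch: predicted time is the sum, over each machine operation used, of its count times its execution time. The per-operation timings are the Illiac I figures quoted earlier (90 microseconds for addition, 800 for division); the function name and the operation counts in the example are our own invention:

```python
# Hypothetical runtime prediction in the spirit of Illiac I job documentation.
# Timings are the figures quoted above; the operation counts are invented.
TIME_US = {"add": 90, "div": 800}  # microseconds per operation

def predicted_runtime_us(op_counts):
    """Estimate total execution time from per-operation counts."""
    return sum(count * TIME_US[op] for op, count in op_counts.items())

# e.g., a job performing 10,000 additions and 500 divisions:
print(predicted_runtime_us({"add": 10_000, "div": 500}))  # 1300000 us, i.e., 1.3 s
```

With machine time this scarce, such an estimate told the center in advance roughly how long a job should run, and made a runaway job easy to spot.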
Many of the algorithms used in the early days of computing had been developed years before computers were available. Mathematical procedures which surely were considered impractical when devised rapidly became basic components of a program library. Of particular importance were algorithms for solutions of simultaneous equations, approximations to trigonometric and other functions, and finding eigenvalues and eigenvectors. The work of many mathematicians rapidly became available as part of a user's repertoire of computational skills. In a small way, the intellect and thought processes of previous generations became alive again. In a sense, a small portion of someone else's intellect could be cloned through the computer.

A Cumulative Process
Perhaps one reason why the growth in computer hardware and software has been exponential is that each new system carries with it many successful ideas created by individuals in the past--computer Darwinism. The "computer tree," with such early computers as Mark I, Eniac, and Ferranti in its main trunk and numerous branches identifying other computer lines which have come and gone, graphically illustrates the richness and dynamic nature of computer evolution.
Of course, not all extinct computing systems succumbed because of their design. The marketplace's ability to absorb some systems was simply inadequate. A good example of this phenomenon was the demise of the IBM 1500 system, which was designed primarily for computer-based instruction and placed on the market in about 1965. This system, forgotten today even by most IBM personnel, was based on an 1130 CPU. The system drove up to 32 monochrome terminals with graphic capabilities, variable character fonts, and a light pen, as well as a 16mm static film projector and a randomly accessible audio tape system. Some of its special features are still not available today in systems designed for instructional work. Such systems, when used with well-designed and optimized instructional programs, in effect clone the art and science of an instructor. Clearly, computers are not restricted to perpetuating only the numerical components of man's intellect.
Today, instruction, music, art, the visual conceptualization of DNA and RNA, and yes, even fantasies can be passed on to others. Surely, with these capabilities the computer is not just another tool.
Some users of computers have their needs well satisfied by procedures defined by others. Others, who want to define their own procedures, find themselves forced to externalize their thoughts into program code, and in making the program run correctly they learn a great deal about the behavior of the process involved. Externalizing and examining one's thoughts about a problem is not a new procedure. Musical scores, the printed word, and algebraic notations are all examples of attempts to externalize what we think about. With computers we can frequently test the adequacy of our thoughts, reformulate and refine them, and test them over and over again.
Of course, not everyone will want to develop his own ideas for the solution of every task proposed for his computer. This would be a tedious and inefficient way to get things done. Nevertheless, each computer user in his own domain of expertise gains a wealth of understanding about a problem and the adequacies of his own thinking by seeking a computer solution.
Computers continue to be built in the likeness of man--cognitive man. Thus, computers should help us improve our thinking processes as we solve problems, including those problems associated with the design and manufacture of more versatile machines.
Computing has come a long way since the days of Illiac I. Today, one can become creative far faster and more easily than ever before. We can only hope that man's ability to think about and solve problems will increase at a rate commensurate with his most pressing problems.