Prediction and predilection: Creative Computing and the future of the micro industry. John J. Anderson.
Let's cut right to the heart of the matter: the real problem with choosing the occupation of prophet is the extremely strong possibility of ending up completely, hilariously wrong.
There are carloads of amusing examples, many from extremely learned sources. Dr. Dionysius Lardner, for example, a professor of natural philosophy and astronomy at London's University College in the early 19th century, asserted that "rail travel at high speed will never be possible, because passengers, unable to breathe, would die of asphyxia." Simon Newcomb is best remembered for his 1901 prediction that "flight by machines heavier than air is impractical and insignificant, if not utterly impossible." Eighteen months later, the Wright Brothers caught the breeze at Kitty Hawk. So much for the supposed experts.
I wonder if the teacher who told the young Albert Einstein he would never amount to much ever said to himself, "Boy, did I screw up!" I sure hope he did.
The one point to take from these examples is this: if you are going to err in predicting the future, err on the optimistic side.
I'm a World's Fair buff, so as far as I'm concerned, one good example here will suffice. At the New York World's Fair of 1939, in the very throes of the Great Depression, visitors to the General Motors Futurama exhibit were awed by visions of a shining future: airships and flying cars and glass skyscrapers 1000 stories tall, cities under the sea, and ribbons of superhighway 50 lanes across. The date of this coming glory? Why, 1969, of course. The world of the future, you see, was only 30 years distant. Just as it is easy for us to envision Scotty beaming us up in the year 2014.

Wrong Again
So we see how many ways there are to go wrong when predicting the future. You may be too biased in the direction of your own interests. You may be overly optimistic. Or you may be overly pessimistic.
Of course it seems difficult to be overly pessimistic these days. One wing of futurists gaining credibility by the minute is a group we may, for convenience's sake, lump together as "the doomsayers." You know them: "The end is nigh." They have a point--just think of all the highly effective means we now have at our disposal to eliminate the future entirely! Nuclear holocaust, overpopulation, and pollution, just to name a few. Predictions of Armageddon have kept prophets busy for the last couple of thousand years, but will put them right out of business if and when they come true--typically short-sighted management, if you ask me.
Then there is the opposing wing, which holds technology itself to blame for all our problems. I call this group the "Neo-Luddites." They advocate a philosophy something along the lines of "Let's return to the dark ages, before it's too late!" They would stop the clock by pulling on its hands, like Harold Lloyd in "Safety Last."
Though I have made my share of predictions, right and wrong, in this magazine and elsewhere, I have always tried to take an active hand in shaping the future, rather than just "reporting" on it. I've been burned at it, too. Just last month the Atari column ended with an open letter to Jack Tramiel, urging him to market a certain new computer. A week after we went to press, the computer was picked up by Commodore. C'est la guerre.
Merely to predict is to attempt to place yourself outside the process--very misleading, actually. For by the very act of prediction, you are attempting to mold the future to your personal vision--in effect, predicting what you want to happen. In acknowledging my own ability to act, I can reject the line of doomsayers and of Neo-Luddites. My own philosophy is more along the following lines: "Don't sit there and mope over Malthus. Get off your keester and do all you can to prove him wrong. Even though you may admit he is right."
Okay, enough philosophy. Assuming we don't shortly burn ourselves out in some sort of apocalyptic flashdance--which, when I am feeling strong, I can bring myself to assume--there are some relatively safe predictions to be made, at least within a highly defined sphere. I'm going to go ahead and make a few here: somewhat conservatively, subjectively, and very much on the bias. I shall address them specifically to the microcomputer industry, including hardware, software, micros as a hobby, and finally the part I believe (desire) Creative Computing to play in the coming years.
That's it for the qualifiers, folks. Now I'll start climbing out on the limb. Notice I never work with a net.

Bye-Bye Boom
As for the future of the industry, well, the first thing you should know is that the honeymoon is over. Passion sure ran high for a while--you could cut it with a knife. The love was unconditional, and computers were about to solve all of the world's problems. Why, folks who didn't have microcomputers were ashamed to admit it. They would say things like "I'm getting one next week," or "I bought an IBM, but it hasn't come in yet." Ah, those were the days--before the world wised up.
Now the boom is over. Not only is the industry faced with a sobered customer, it is faced by a customer with a positive hangover. Last night this poor fella was totally inebriated by the notion of the computer, but all that is left this morning is a headache, a ringing in the ears, and a vague crankiness whenever the word "diskette" is uttered.
Many of the so-called industry "experts" made a Futurama of the micro market. With straight faces they predicted sales of 90 million units in 1985 and other such drivel. Realities have dictated a different story. This has resulted in a very competitive, though stratified, marketplace. In response, the industry must mature.
One of my favorite analogies is the comparison of the microcomputer industry to the early movie industry. The days of Woz and Jobs--those were like the days of Mack Sennett and D. W. Griffith. Heck, even a decent two-reeler back then was big news. Things like pans, parallel-action cutting, dissolves, and fade-outs were being invented on the fly, and those who did the inventing more often than not had no idea of the significance of their acts.
Nowadays the big corporations have stepped in, and in a direct parallel to the fledgling motion picture industry, one by one all the little independents are being squeezed out. The last true visionaries and entrepreneurs of the micro industry will soon be dispossessed. And the unimaginative, lumbering moguls will have the game to themselves. The products will no longer be born of inspiration, but of formula.
And ultimately, the industry will lose its spark, but gain a new sophistication. Forgive me the partisanship, but I'm quite convinced that the movie metaphor can be extended one step further: the coming of the Macintosh will do for the microcomputer business what the first talkies did for the film business. The Lisa was "The Jazz Singer," you see.
How can the market mature? It must offer systems that do more. It must offer systems that cost less. It must offer systems that are easier to use. It must continually offer innovation, as opposed to parochialism. And it must stop underestimating the customer.
See how easy it is to be a prophet? I wonder what the dues are in the Clairvoyants' Local.

Standard and Poor
Pardon my wrench, but the search for standardization is to my mind so much turkey too-toos. Has MS-DOS really done that much for the industry? Even the best MS-DOS programmers will tell you that MS-DOS is mediocre and that its main claim to fame was to aid the popularization of the relatively mediocre piece of hardware on which it was designed to run. Better to abandon a standard, if you ask me, than to converge around a lousy one.
And I don't think standards can ever be other than mediocre, because they are compromised by the very fact of their standardization. It is quite like TV catering to the eleven-year-old mind. Wonderful, if what you are trying to do is sell deodorant during commercials. Not so wonderful, if you are trying to sell thinking between commercials. I even know eleven-year-olds who are insulted.
Lest you think I am making this piece into some sort of anti-IBM diatribe, let me set you straight. Some of my best friends own IBMs. Now that they have sent me a full-stroke keyboard for my PCjr, I may actually boot something up on it. Some people are so up on IBM--because of those three blue letters. Some people are so down on IBM--because of those three blue letters. I'm not anything because of those three blue letters. I'm down on mediocrity and up on excellence.
For you see, we are on the verge of choosing the next standard, since MS-DOS is about played out. We are coming to a crossroad, a watershed, a cusp, as it were.
My message is that we should not resist a little innovation--the heretical idea of trying something new. We must be willing to surmount our conformist urge to set arbitrary standards, in the hope of just maybe finding something better. It is a little like freedom of choice. Hey folks, this is America!

Smell the Coffee
It's America, all right, but we have already been passed, and we aren't even smart enough to wake up and smell the coffee. The Japanese already have us beat, while we bicker over whether they have us beat or not, because they are better tuned in to the secrets we preach but don't practice. You know: long-term planning, a healthier view of the market, a true commitment to R&D, to education, to new technology, to quality, to perfection, to personal accountability on the assembly line. That sort of stuff.
Of course if we fall into the mood to get our act together, we can knock their socks off. We can prove Alan Kay wrong in his recent prediction that the first working Dynabook will come from Japan. We can start our own Sixth Generation project. We can make a comeback in robotics. We can lead the way into the future, instead of following Japan's lead into the future.
And think of it: all we have to do is wake up and smell the coffee.

Future Stock
It's easy to tell you what is going to happen to hardware in the future. We will develop extremely powerful 16-, 32-, and 64-bit CPU chips, requiring very little power and capable of operating at very high speeds (10 MHz+). Custom VLSI chips will get bigger and bigger, and the day is not far off when entire motherboards, consisting of one or more central processors and multiple support processors, will be surface-mounted on a "board" that is actually etched as a single chip. This technology will start out expensive, but will eventually bring costs down even lower than they are today.
Concerning RAM: it will continue to become cheaper and less bulky. CMOS technology will make volatile RAM a remnant of yesteryear; it will take only a trickle of current to maintain memory between power-ups. Once RAM is non-volatile, ROM becomes an antique. And new means of RAM storage will blur the line between RAM and conventional mass storage.
Mass storage itself will also make interesting strides. Hard disk technology will continue to miniaturize, and a 20 meg 3.5" drive will be realized within a few years. The 5.25" floppy will go the way of the 8" floppy--dinosaur city. The 1.5" floppy will of course be waiting in the wings--capable of storing 1 meg per side.
Want to talk about displays? Okay. The full-screen color LCD (80 columns X 25 lines in text mode) will make its appearance within a couple of years. In addition, dramatic new means of achieving flat-screen color will debut. The CRT is a long way from being outmoded, however. It will remain with us, relatively unchanged, for the next decade or two.
Miniaturization will continue along its current trend. Portable computers will eventually become the hottest segment of the micro market, and 1000K machines with 12 MHz 32-bit processors, full-screen displays, and internal printers will weigh less than 10 lbs. and cost less than $1500. Need a year on this? Try 1987 on for size. By 1990 no computer bigger than the Apple Macintosh will be selling well.

Software Soothsayer
It is much harder to be sure where the realm of software will be going as the '90s near. Much depends on the question of software standards, raised in a nutshell earlier on. The search is on for a new standard in multi-user operating systems. Big guns, including AT&T, are betting on Unix, and I have kidded more than once that if Unix becomes the next standard, I am leaving the industry to become a tuna fisherman. Unix makes me nauseous. It would be a fitting epitaph for the U.S. microcomputer industry if Unix were to follow MS-DOS as the next software standard.
So don't get me started on Unix. Allow me merely to say that it is an operating system that to my mind has long outlived its usefulness and is being kept alive only by respirator. I say pull the plug--software euthanasia. I'm sure we can dredge up something better than that diseased old fossil.
Without getting too hung up in buzzwords, I do think that "ease of use," "integration," and "software that works intuitively" are phrases that will not fade with current fashion. They spell the direction of software for the future. And I think the nested window/menu/mouse approach of the Lisa and Macintosh will become a standard for future software to match.

Hobby Hoarse
When Creative Computing first got started, nearly all its readers were either educators or what you might term "enthusiasts." Nowadays, it is difficult to pin a label on the typical reader: he is certainly not a hacker, just a user. He doesn't explore the computer itself to pass the time, but uses it to run something else--a game, or business simulation, or personal productivity package, or other application of interest.
The hacker is still there, of course; he is just harder to find. He wishes he had a more powerful programming language. He wishes he had a smarter computer. He wishes it had more memory and more color and more animation potential. He wishes for better telecommunications potential and inexpensive hard disk storage. The low cost machine that satisfies these conditions will earn a niche in the future hobbyist marketplace.
On the software front, the enthusiast has grown quite difficult to please. He is not about to buy the next PacMan clone that hits the entertainment shelf simply because it is there. He wants something unique, something innovative, something hot and involving, something that shows off the very best capabilities of his machine. Software of that description has been rare lately, and many entertainment houses are hurting as a result.
I don't believe that "the bottom has fallen out" of the entertainment software industry, as other analysts have posited. I think the deadwood has taken a heavy toll and the market is ripe for some quality. When it appears, the packages will move.
Certainly there was a "fad" aspect to the consumer entertainment computer--a fad which by and large has passed. But the collective consciousness has been raised. The stage is set. Soon we shall see the next-generation magic machine.

The Role of Creative Computing
I have been thinking that it would be nice to make some declarations of principle in this, the tenth anniversary issue of Creative Computing magazine. Surely we have changed, and we will continue to change with the times. But we will never abandon the basic tenets of the philosophy that David H. Ahl brought to this, his magazine: that using computers should be fun; that learning about computers, too, should be fun; that a magazine is needed to make those enjoyable aspects obvious and accessible; that computer users should be supported with software, applications, tutorials, and reviews; that 100% computer literacy should always be our goal; that we shall always "call them as we see them"; that we will do our reporting with humor and intellect; that we will provide timely coverage of new developments and in-depth reviews of the hardware and software that really matter; that we will provoke our readers to think; that we will present both sides of controversial issues; that we will never be biased toward one piece of hardware; and that we will display an ongoing commitment to human creativity with computers. We are, after all, the magazine with the word "Creative" in its title.
I don't have to hope the next ten years prove me right on that score. I know that to be true.