The Personal Computer

1975 witnessed the birth of the personal computer, that is, a computer that could be owned and, ostensibly, operated by an individual with an average income. The cover of Popular Electronics magazine announced the MITS Altair, a 'kit computer' that people would buy and build, much as one might buy a model airplane kit or a dollhouse and then spend hours, days, weeks or more building it. When the Altair finally shipped, it had no operating system.

Consider that in 1975:

The Altair cost $300.

It shipped with no assembly instructions.

I could pay two months' rent with $300 in 1975.

Analog computer - a computer that represents data by measurable quantities, such as voltages, rather than by numbers

What kind of person would spend two months' rent on a kit of wires, bulbs, screws and circuit boards that didn't come with assembly instructions and, when complete, didn't even do anything? It had no monitor. It was just a box with some lights. And the lights wouldn't light up, because no one had written the program to make the lights light up.

Two months' rent for the privilege of figuring out how to hook up a box with a bunch of wires and bulbs that didn't light up, because no one had yet been able to write any software to make the little bulbs light up.

Oh, I repeated myself, didn't I?

Digital computer - a computer that processes information using numerical digits expressed in a scale of notation to represent discretely all variables occurring in a problem

Okay, let's say you shell out two months' rent money because you just want to see if you can put it together and do some cool trick with it, like solve a math problem. You decide to spend the two months' rent money. Moan. You put the machine together and it seems to be hooked up right. Now what? Don't forget, two months' rent.

Better get your money's worth, so you write BASIC software so that the computer can do something. Does this sound like something you might do? Maybe, if the rewards justified the gamble. But what kind of person could have that certainty?

The Altair had a total memory (RAM) of 256 bytes. Do the math and you'll find that's enough to store about 50 words. That 50 words' worth of space had to contain the code for the entire operating system. What operating system? No one had ever written one for the Altair, and why would they? You wouldn't be able to do much of anything besides make a bulb blink. And what then? That's it: two months' rent down the tubes. At least that's the way the average person might have looked at the situation, but the people involved in the personal computer revolution were anything but average.1

While walking through Harvard Square one day, [Honeywell employee Paul] Allen spotted the Popular Electronics cover that featured the Altair. Like many other computer enthusiasts, he realized at once that the Altair was a tremendous breakthrough. But he also saw it as something of personal interest. Allen ran to tell Bill [Gates] that he thought their big break had finally come. Bill agreed. 'So we called this guy Ed Roberts,' said Gates. 'We had a fairly aggressive posture. We said, "We have a BASIC. Do you want it?"' In 1975, Allen and Gates were pioneers in the industry practice of preannouncing products that they didn't have yet. In later years, this type of thing would come to be called 'vaporware.'

Roberts was justly skeptical. He had heard from many programmers who claimed they could write software for his computer. He told Gates and Allen what he told everyone else: he would buy the first BASIC he saw actually running on an Altair.

Unlike the others, Gates and Allen followed through, and about six weeks later Allen flew to Albuquerque to show Roberts their BASIC. The demonstration was a success, even though their BASIC initially did little more than announce its presence. The Traf-O-Data company, newly renamed Micro-Soft (later changed to Microsoft), had made its first sale as a microcomputer software house.

At the time, Bill Gates was a college freshman, and still in his teens. Allen worked for Honeywell. The two had faced the challenge of writing a BASIC for a computer they had never before seen, and of demonstrating it to its 'inventor' with no dry run.

When Apple Computer, Inc.'s Macintosh made its first appearance on people's desktops back in 1984, it had an odd, unassuming new look. Apple announced the pint-sized PC with an extraordinary, award-winning Super Bowl commercial that claimed, 'On January 24th, Apple Computer will introduce Macintosh. And you'll see why 1984 won't be like "1984."'

With a mind-numbing 128 kilobytes of RAM, no internal hard disk and a removable 400K floppy disk drive, Apple had planted the seed for a 'desktop publishing' and 'multimedia' revolution in the unfolding information age. Apple forged ahead with its Macintosh line, and by 1987 it had introduced a larger, 13-inch color monitor with an 8-bit video card, and a CPU that boasted 4MB of RAM and a larger hard disk.

As always, the key to Apple's success in the graphics market was its user-friendly, visual interface. If you feel lost when it comes to understanding how a computer works and how you can make the best use of one, try to think about the computer in situations with which you are already familiar. That's what the creators have in mind each time they update the system software. The following pages will introduce you to the terminology and concepts you should understand in order to use the computer for design and graphics.

Pre-Macintosh: spring 1985

The equipment necessary to set type, a Compugraphic EditWriter 7500, plus expensive film strips for each font, resin-coated paper, and a processor and chemistry for developing galleys of type, not only took up quite a bit of room but also cost a small fortune: maybe $20,000.

Line art and photographic images could be prepared on a process camera to the tune of another $10,000 or so. Better have an industrial or commercial space to house the darkroom along with the horse-sized camera.

Did I mention city permits for the processing of hazardous waste in the form of chemical processing solution and waste silver by-products? Add several thousand dollars in annual costs, and we can just about eliminate a small start-up.

Computer graphics in spring of 1985

kludge - In computer lingo, an unfriendly interface or design, or an ugly mass of wires and boards is called a 'kludge.'

Newspapers and small cottage-industry start-ups were niche markets for equipment that some shortsightedly labeled a toy. I wasn't discouraged, though; I was having too much fun. I eagerly donned my role as pioneer and stood up to the jeers and guffaws of those already entrenched in the graphics and newspaper industries.

I quickly became a willing participant in the evolutionary process despite the occasional arrow that pioneers suffer. There were few resources for learning, forcing the early adopters to go it alone. The only paint program available for the computer shipped bundled with my Mac 512K; it was aptly called MacPaint. The first sentence of the 32-page pictorial manual stated, "The best way to learn MacPaint is to explore the drawing tools and patterns on your own. This is a book of hints to guide you in your exploration."

Ready or not, the only documentation we got with the program was 32 pages of hints. It's not that I felt cheated, exactly, but there just wasn't much help to be found anywhere. This personal computer stuff was new to everyone, and I was the first person I knew who bought into it. I guess I was as well-prepared as anyone. I had been a keypunch operator and a computer operator, and I was just finishing up six years studying graphic design, publication design and photojournalism, which included lots of hands-on typesetting and process camera operation. I felt confident I could teach myself this stuff; after all, I wasn't about to pour $14,000 down the drain and then run home with my tail between my legs.

In those early days I survived on MacWorld magazine, which kept current with every commercial graphics program that existed (there weren't many), and a book called Zen and the Art of the Macintosh: Discoveries on the Path to Computer Enlightenment, written by Michael Green. The first page of Zen proudly announced, 'This book was written, edited, designed, illustrated, typeset, laid out, and pasted-up entirely on a Macintosh computer.'

When I first ran computers in 1971 and watched those giant line printers stamp out graphic representations from a stack of punchcards filled with holes, it was magic.

MacPaint was just incredible. With MacPaint and a Macintosh 512K computer I could do the same thing in a much more direct way. Instead of writing computer programs, I'd get to use tools much like the ones I'd just learned to use in school, only the new tools were electronic, mere copycats of the real thing, but nonetheless perfectly suited for producing newsletters, manuals, books, brochures, logo designs and so much more.

MacPaint was pretty easy to learn, being a one-bit program running on a monochrome monitor. The entire MacPaint tool set consisted of four painting tools: a brush, a pencil, a paint can and a spray can. There was a line drawing tool, and the shape drawing tools consisted of a rectangle, an ellipse, a polygon and a freehand shape. There was a type tool for setting type, two selection tools (a rectangle and a freehand selection), and, last but not least, a hand tool for moving the image around in the window. MacPaint also boasted 38 bitmapped patterns for filling, shading and tinting, and three short menus: File, Edit and Goodies.

 

©1986, Running Press. Reprinted with permission of the publisher.

Goodies consisted primarily of a 'mode' called FatBits, which enlarged the image to about 1600%, making it easier to edit the image pixel by literal pixel. MacPaint file resolution was limited to 72 dpi. The only way to get better resolution was to use a word processing program or a drawing program. In 1985, that would have been MacDraw, which was a watered-down version of the draw module in today's AppleWorks.

I guess I learned everything there was to know about MacPaint in a few weeks or so while I was learning how to use the computer and manage my files. Zen didn't come along until maybe a year later, but that was all right, because I don't think I came out of hibernation until then anyway. During that time I learned everything there was to know about Microsoft Word, MacDraw and PageMaker, which made its debut in 1985.

Apple bundled a word processing program called MacWrite with the 512K, but MacWrite lacked the ability to create 'leader dots' (fill characters with a right tab), and that was an absolute must for any typesetter. That made me a very early adopter of Microsoft products. None of the word processing programs let you put graphics and type on the same horizontal line, so I started making newsletters in MacDraw, using graphics that I created in the drawing program and images that I scanned using my 300 dpi scanner, called ThunderScan. The ThunderScan snapped onto the platen of Apple's 72 dpi dot matrix printer, the ImageWriter.

The platen on the ImageWriter was very much like that on a typewriter. Theoretically, any image that could be rolled into the mechanism could be scanned with a ThunderScan. It worked pretty well for a one-bit scanner.

©1986, Running Press. Reprinted with permission of the publisher.

The process was slick, at least that's what I thought at the time. No more sending out for type. No more sending out for halftones, and best of all, no more pasteup! There was a bit of a problem, though: the type really looked awful to anyone with a critical eye. The letter spacing and word spacing were crude, and what's more, you couldn't mix styles within the same paragraph. That meant that if you wanted to italicize or bold even a single word, you'd have to place an opaque mask over the existing word and then paste an italic or bold word over it. What a hassle! Soon PageMaker hit the market, and all was well with desktop publishing once again.

Aldus PageMaker absolutely made desktop publishing, period. With PageMaker you could do everything that was impossible in Microsoft Word and MacDraw, at least all the stuff we wanted to do then, like mix type styles within the same paragraph and place text and images anyplace on the page.

Today PageMaker is published by Adobe, which has also recently released a page layout and illustration program called InDesign, combining many of the best features of PageMaker and Adobe Illustrator. Next to PostScript, Adobe's biggest claim to fame came with the 1989 debut of the premier image editing program, Photoshop. To this day, Photoshop has no serious competition. Adobe's illustration program, aptly called Illustrator, has always been given a run for its money by Macromedia's FreeHand, but Illustrator still dominates as the pro's software of choice.

©1986, Running Press. Reprinted with permission of the publisher.

QuarkXPress entered the page layout and design market in the early 1990s and began, slowly at first, to erode the market dominated by Aldus PageMaker. Gradually Quark made gains, and art departments jumped ship, switching to a company that was listening and responding to the needs of a burgeoning graphics market.

Aldus, however, was not paying attention to the cries of its high-end design, newspaper and magazine market, who were primarily Macintosh users; instead, Aldus continued to concentrate its efforts on the business world. Major dumb move. Why? PCs running Windows dominated the business world, requiring Aldus to devote a majority of its resources to porting PageMaker to PCs.

At the same time, Aldus failed to address major PageMaker bugs on the Macintosh platform, frustrating and angering large numbers of its established user base. Aldus' timing was off: virtually no one in graphics used a PC, because the service bureaus that output film all used Macs. Windows would not be a player in the graphics market until a couple of years down the road. It was during this period, in the first half of the 1990s, that Quark gained the upper hand with designers and art departments in the graphic arts world.

I have money invested in both PageMaker and Quark and use both all the time. To date, neither is the perfect program for my needs, so I go back and forth based on the nature of the job. Quark, however, is by far the graphic artist's page layout program of choice.

While the battle of the page layout programs ensued, the Internet, which was first used by education, science and government, was being primed to step into the communications limelight. Within just a few years, graphic artists would be rushing to learn how to design a new kind of page for a brand new graphics medium that we know as the World Wide Web.

Looking back over the course of history, it is apparent that as a species, we humans are unique in our technological prowess. We are constantly on the lookout for ways to cut our workload while saving time and money. In order to satisfy our need for efficiency, speed, and unique forms of expression and communication, computers have crept into virtually all facets of our lives: at work, at home, in our automobiles, telephones, cash registers, coffee makers, ovens, traffic signals, art and design of all sorts, the graphic arts, and the list goes on. Computers are everywhere. Much of the time when we use computers, we don't even realize it; of course, when we use a computer to write a letter, we are well aware that we are using a computer and not just a supercharged typewriter.

What makes up a Personal Computer?

Computer workstations, like those that enable us to log on to the internet, write a letter, perform a series of calculations, or create art and design, consist of a number of different components, which fall into three primary categories: 1) input devices, 2) the central processing unit (CPU), and 3) output devices.

Input Devices

In order to accomplish anything on a personal computer today, it is necessary to get information into the computer, transformed from an analog format into a digital format, because computers have been digital since the 1946 ENIAC. While there are many ways to digitize information, the most obvious are through the keyboard and the mouse.

Because today's computers are digital, unlike their analog ancestors (the loom, Babbage's Analytical Engine and the Colossus), computer engineers are now creating input devices that are already digital, eliminating the need for complex analog-to-digital conversions.

If the input device is analog, then the computer must be equipped with a special capture board, or another type of board and software, to translate the analog signal into digital form. Once the information is in digital format, the personal computer can perform complex operations, customized and determined by the particular software application chosen, fulfilling the promise of a general purpose computer.
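To make the idea concrete, here is a minimal sketch in Python of what an analog-to-digital conversion does: it samples a continuous signal at discrete moments and quantizes each sample to a fixed number of levels. (The function names, the sample count and the 8-bit depth are illustrative assumptions, not any particular capture board's actual behavior.)

    import math

    def digitize(signal, num_samples=8, levels=256):
        # Sample the continuous signal at evenly spaced moments, then
        # quantize each sample to one of a fixed number of digital levels.
        samples = []
        for i in range(num_samples):
            t = i / num_samples                  # a discrete moment in time
            value = signal(t)                    # analog value between -1 and 1
            level = round((value + 1) / 2 * (levels - 1))  # map to 0..255
            samples.append(level)
        return samples

    # Digitize one cycle of a sine wave into 8-bit (0-255) values.
    print(digitize(lambda t: math.sin(2 * math.pi * t)))

Everything the real hardware does is a far more refined version of these two steps: sampling in time and quantizing in value.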

The mouse controls the computer's cursor, which assumes a variety of guises during the course of a work session, depending on the working environment in which it finds itself. At the desktop, the cursor assumes the appearance of an arrow. As it moves over text, it changes into an 'I-beam.' In Adobe Photoshop, it can assume the appearance of any of the many tools it mimics. In addition, the mouse is used to call up menu items, pre-programmed instructions that the user invokes to set actions in motion.

The keyboard is also an input device. We use it for typing and entering numeric information, and we use it to type shortcuts, special key combinations that call up menu items; with Apple's Easy Access on a Macintosh, the keyboard can even be used in place of the mouse. Voice recognition technology, in which the computer converts human speech into data, continues to improve, reducing the need for keyboard input.

Central Processing Unit (CPU)

Once information is entered into the computer, the central processing unit (CPU) takes center stage and, if necessary, converts analog information into a digital format. The heart of the CPU, sometimes called the microprocessor, is the G4 in a Macintosh PowerPC computer, while in a Windows-compatible machine it may be Intel's Pentium III or IV.

The microprocessor's capability is determined by the amount of information it can handle simultaneously, the number of different instructions it can perform, and how quickly it can perform each instruction. Many people rely solely on processor speed, for example 400 or 500 MHz, when comparing computers' speed and performance, but the only test that counts is a bench test, in which two computers are given the same task starting at the same time. The computer that finishes first is faster; processor speed alone does not tell you how a machine will perform under actual computing conditions. Check bench tests conducted by MacWorld, MacWeek, PCWorld or Consumer Reports for scientific tests and accurate reports.
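The idea behind a bench test can be sketched in a few lines of Python: run the same script on two machines and compare the elapsed times. (The toy task here is an illustrative stand-in for the much more rigorous real-world tests the magazines run.)

    import time

    def bench(task, *args):
        # Time one task with a stopwatch: start, run, stop.
        start = time.perf_counter()
        task(*args)
        return time.perf_counter() - start

    # The same task is given to each machine under test; the one that
    # finishes in fewer seconds is the faster computer for that task.
    def task(n):
        return sum(i * i for i in range(n))

    print(f"Finished in {bench(task, 1_000_000):.3f} seconds")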

What's under the 'hood'

Open up a computer from the last half of the 1990s, and it is a relatively tidy set of circuit boards, chips, disks and wires. Underneath all the silicon and copper is a machine that processes information using simple, step-by-step logical instructions.

ROM

The core of the computer's operating system resides in Read Only Memory (ROM). The ROM consists of a chip, or group of chips, affixed to the motherboard. The ROM tells the microprocessor what to do when the power comes on, and it is the key to the computer's visual interface: the ROM determines the unique appearance of the icons, the windows and the menus.

Until 1985, Apple had a lock on the 'windows' GUI for commercial personal computers. Microsoft gained the privilege of using parts of Apple's graphical user interface (GUI) in 1985, when Apple granted Microsoft a partial license. Microsoft, in turn, released Windows 1.0 later that year.

If the entire operating system were stored in the computer's ROM, minor system enhancements would require a firmware upgrade and would end up being unnecessarily expensive. To avoid the hassle and expense of a firmware upgrade each time a bug is fixed or a system improvement is made, the remainder of the operating system is encoded in software files that make up the majority of the contents of the System Folder.

When the computer's power is turned on, the ROM wakes up and kicks into action, searching for two essential files, the System and the Finder. When it locates them, the computer's monitor comes to life, displaying the familiar Happy Face and the message, 'Welcome to Macintosh.' If the ROM discovers any kind of problem with the System or the Finder, it will display an 'X' across the screen. See Troubleshooting, 2-1.
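The logic of that startup check can be sketched in a few lines of Python. This is a loose illustration of the idea, not Apple's actual ROM code; the folder and file paths are assumptions for the sake of the example.

    import os

    def boot(system_folder="System Folder"):
        # The ROM's first job: locate the two essential files.
        for essential in ("System", "Finder"):
            if not os.path.exists(os.path.join(system_folder, essential)):
                return "X"                   # a problem: display the 'X'
        return "Welcome to Macintosh"        # happy face and greeting

    print(boot())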

Output devices

Monitor - the monitor is the most obvious output device and returns information to the user in the form of a visual display. One need only take the time to read and understand the display in order to receive the message being communicated by the computer.

It is the job of computer interface designers, the team of software engineers and computer programmers who write the system software, to communicate the processed information in a way that is easily understood by the user: to send a written or graphical message, a word and/or a picture.

As consumers, when we say that a computer is 'user-friendly,' we mean that it is a computer that is easy to understand and use. A program or computer that is easy to use is also called 'intuitive.' What exactly does that mean?

Are we saying that the computer itself is intuitive, able to act without thinking? Of course not. A computer must 'think through' billions of bits of information before deciding upon each of the millions of instructions it performs to complete even a simple task like 'starting up.'

When we say a computer is intuitive, we really mean that it is designed with such elegance that the user is able to interact with it intuitively, without having to think at all, or certainly no more than necessary!

Therefore, we design our computers to become more intuitive with each new iteration of microprocessor, each generation of software and each evolution of energy supply.

A printer is among the most common of output devices. Once upon a time there was a common misconception that using a computer would reduce the need for printed materials, creating a 'paperless' office. In reality, we have probably done just the opposite, printing everything imaginable: a picture of our computer desktop, files created by software applications, photographs, pages from the internet and much more.

Other output devices include, but are not limited to: video cameras, film recorders, high-resolution imagesetters, digital presses, vinyl sign cutters and manufacturing equipment.

The binary system and all that math

The binary system is a simplified counting system which represents information using only two digits, 0 and 1, instead of the ten digits used in the decimal system. Using a binary-based, or Base 2, system makes the data comprehensible to an electronic machine that recognizes only two states, or positions: 'on' and 'off.' Zero is interpreted as 'off,' and one as 'on.'

Computers are unable to recognize or manipulate analog information, which can also be described as a continuous stream of information. Digital information, by contrast, is divided into discrete 'chunks,' each one separate from the next.

By stringing together the two binary digits, or bits, 0 and 1, in various combinations, we are able to represent analog information in a digital format which can be recognized by the electronics used in computers.

A single bit is so small that it cannot convey even a single letter of the alphabet, which requires eight bits, otherwise known as a byte. The letter 'A,' for example, is represented by the string 01000001, the letter 'B' by 01000010, and so on. By changing the sequence of bits, one can represent any of the 26 letters of the alphabet, as well as 256 possible gray-level decimal values ranging from 0 through 255.
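Those bit strings are easy to verify yourself; a quick Python sketch, assuming the standard ASCII character codes:

    # Print each letter's ASCII code as an 8-bit byte.
    for letter in "AB":
        print(letter, format(ord(letter), "08b"))
    # prints: A 01000001
    #         B 01000010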

As information becomes more complex than individual letters or 256 gray levels, bits are arranged in arrays, and larger units of measurement are required to represent larger amounts of data.

We're accustomed to counting with the decimal system, which uses 10 digits as its base, also known as Base 10.

The ten digits are 0, 1, 2, 3, 4, 5, 6, 7, 8 and 9. After running out of digits, we begin again, adding a second place: a 1, followed by 0, 1, 2, 3, 4, 5, 6, 7, 8 and 9. We continue by replacing the 1 in the second place with a 2, followed by 3, 4, 5, 6, 7, 8 and 9, and we continue in this fashion until we arrive at 99, after which we add a third place, bringing us to 100.

Exponential notation is a mathematical shorthand used to represent increasingly larger numbers: 10² is another way of stating 10 x 10, or 100; 10³ is 10 x 10 x 10, or 1,000; and so on. 10¹⁰ is another way to represent the number 10,000,000,000 (10 billion).

The binary system uses only the two digits 0 and 1, so it can count only to 1 before it must add a second place. Counting in binary goes 0, 1, 10, 11, 100, 101, 110, 111, 1000, and so on.
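The same counting sequence can be generated in a couple of lines of Python:

    # Count from zero to eight, showing each number in binary.
    for n in range(9):
        print(n, bin(n)[2:])  # 0, 1, 10, 11, 100, 101, 110, 111, 1000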

Like the decimal system, the binary system uses exponential notation to express larger numbers, but instead of using terms like tens, hundreds, thousands and millions, it expresses those multiples in bytes.

Binary counting is based on two digits, each one a bit. Eight bits, the number required to represent a character (or a gray-level decimal value), equal one byte. 2¹⁰, or 2 x 2 x 2 x 2 x 2 x 2 x 2 x 2 x 2 x 2, equals 1024. We call this a kilobyte. Typically 'kilo' means 1000, but since 2¹⁰ = 1024, and not 1000, we fudge a little and use a term with which we're already familiar.
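The arithmetic is easy to check in Python:

    print(2 ** 10)           # 1024 -- one kilobyte, the 'fudged' thousand
    print(2 ** 20)           # 1048576 -- one megabyte, a 'fudged' million
    print(2 ** 10 == 1024)   # True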

The number of possible states in a Base 10 counting system exceeds the number of electrical positions a simple circuit can reliably represent. Successful communication with the computer depends on a common understanding between the giver and the receiver; a 'language' based on a simple 'yes'/'no,' or 'on'/'off,' logic can be communicated using Base 2 math.

Bit - acronym for 'binary digit.'

Exercise 4-1

How many ways can you represent the concept '12'?

Make a list on a piece of paper showing every way you can think of to represent the number 12. For example, six can also be represented by '6' or by 2 x 3. Show all the ways to represent the number 12, using visual images, numbers or other representations.

This exercise is designed to help you consider several different solutions when attempting to solve a problem; if you're stuck, the sketch below offers a hint.
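If you have a computer handy, a few lines of Python will get you started (a hint, not the whole list):

    n = 12
    print(bin(n), oct(n), hex(n))      # binary, octal and hexadecimal forms
    print(3 * 4, 24 // 2, 2 ** 2 * 3)  # a few arithmetic forms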

Data Storage

The place where the information, or data, is stored is referred to as media. There are many different types of media, the most familiar being the computer's hard disk, because most computers are sold with a hard disk already inside the case. Each type of media has its own unique qualities and storage capacity. At the time of this writing most computers are sold with hard disks that are measured in gigabytes.

Quantities of information are measured, and take up space on the media, in units that treat the bit as the smallest unit of information. Many people mistakenly refer to the amount of storage space available as memory, but it is really just disk capacity.

The media

Let's get it over with once and for all, right here, right now. For all practical purposes, the Macintosh 512K computer was the first graphical interface computer embraced by the general public. A 128K model preceded the 512K by less than a year, and Apple's Lisa came just before the 128K, but the 128K was used mostly by the hobbyist and game-playing audience.

When the 512K was introduced along with the LaserWriter in 1985, all hell broke loose in the world of computer graphics and desktop publishing. The revolutionary LaserWriter was a printer based on Adobe's new PostScript programming language. The LaserWriter printed at a resolution of 300 dots per inch (dpi) and was touted as producing 'near photo-typeset quality' type and images.

The fact is, in 1985 most people were barely aware of the Macintosh computer or its capabilities. I was incredibly excited at the possibilities. I was an early Macintosh aficionado: brave, because I knew I'd have to learn it on my own, and proud, because I shared the vision and backed it with a $14,000 financial commitment. I buckled in; this was going to be a real roller coaster ride.

In 1985, personal computers offered very few storage options. The most convenient and cost-effective method of storage was the 3-1/2-inch 'floppy' disk, which could hold a whopping 400 kilobytes of data. Hard disks for the Macintosh were available in 10- or 20-megabyte capacities.

Today's survivors are floppy disks that store about 1.4 megabytes. Hard disks now weigh in at 50 gigabytes and up. Storage devices with removable media include Iomega's Zip (100MB and 250MB) and Jaz (1GB and 2GB) drives, magneto-optical and magneto-resistive technologies, and CD-R, CD-RW and tape drives, the last usually used for backup because they are relatively slow. DVD drives arrived on the scene late in 2001.

Random Access Memory (RAM)

Random Access Memory, or RAM, is the computer's working memory, the horsepower available to run programs and perform tasks. RAM temporarily stores information that is copied or cut from a document. When we cut or copy data, we say that it is 'on the clipboard,' a user-friendly analogy for the nebulous electronic purgatory where data exists before it is committed to disk or deleted. Using the term 'clipboard' helps to make the computer user-friendly, because we all know how a clipboard is used. The computer also has a 'scrapbook,' given its distinguishing name to denote its permanence. We all know that if we want to keep snippets of memorabilia permanently, we put them in a scrapbook. It's the same way with the computer.

Because the clipboard stores only one cut or copied element at a time and is temporary, anything on the clipboard is lost when the power is turned off, the computer restarts, or there is a system failure; the clipboard is part of RAM. Unlike the clipboard, the scrapbook can store as many cut or copied elements as the computer's disk capacity allows, because the scrapbook is stored on the hard disk as a file in the System Folder. System failures, power outages and restarts have no effect on the elements stored in the scrapbook, but if an element is only on the clipboard, it's a goner.
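A loose Python sketch of the distinction, with an in-memory variable playing the clipboard and a file on disk playing the scrapbook (the names and file path are illustrative, not Apple's actual implementation):

    clipboard = None                    # lives in RAM; gone after a restart

    def copy(data):
        global clipboard
        clipboard = data                # replaces whatever was there before

    def add_to_scrapbook(data, path="scrapbook.txt"):
        with open(path, "a") as f:      # appends to a file on the hard disk,
            f.write(data + "\n")        # so it survives restarts and crashes

    copy("one snippet at a time")
    add_to_scrapbook("kept until deliberately deleted")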

RAM resides on separate chips that plug into the computer's motherboard, and it is measured using the same exponential notation used to describe data storage, which may explain why some people confuse RAM with hard disk storage.

On startup, the operating instructions that are not encoded in the computer's Read Only Memory chips (ROMs) are found in the System and Finder software files and are automatically loaded into the computer's RAM. Loading the instructions into RAM rather than running them from the ROMs really speeds things up; RAM is characteristically much faster than ROM.

RAM is temporary, and anything stored in RAM is lost when the power is turned off, but there are certain settings that we would prefer not to lose every time we restart the computer. In order to retain this important information, a portion of RAM, called Parameter RAM or PRAM, is set aside and kept continuously powered by the computer's battery. The PRAM stores the date, time, control panel and desktop preferences, and printer settings.

Information Exchange

As stated earlier in this chapter, the computer's primary purpose is to assist humans in processing information. The computer is a tool, like a hammer, a stove, or any other tool, albeit much more complex and sophisticated. But, like the Swiss Army Knife, the computer far exceeds the versatility of most tools, because it can be made to perform a wide variety of tasks, depending on its construction and the software application it is running. The personal computer and the computers inside our automobiles and home appliances are modern incarnations of the 'Analytical Engine' envisioned by Charles Babbage in the 1830s.

Broadly speaking, a computer in the year 2000 serves its purpose when the user turns it on, puts information into it, and allows the software to assist in processing that information. The result is then fed back to the user, displayed on the monitor or printed for the user to manage.

Visual Display

Once upon a time, back in 1984, when Apple Computer, Inc. debuted its signature Macintosh 128K computer, the visual display, crude by today's standards, was pretty darn revolutionary. The screen measured nine inches diagonally (with a 7.25-inch usable image area, on a good day), and its display consisted of a white background and black type.

Other computer displays had a black screen with white, red, orange or green type; all were pretty hard on the eyes. The Macintosh, on the other hand, borrowed the vision being explored by Xerox at its Palo Alto Research Center (PARC). Steve Jobs of Apple Computer, Inc. took a tour of Xerox PARC in 1979 and was blown away by the graphical, 'user-friendly' interface. Xerox's computer displayed layers of 'paper-like' documents and programs that came to life when the user pointed and clicked a mechanical pointing device on the picture representation we know as an icon. The Xerox experimental computer displayed type the same way people were used to reading it: black type on a white page. That feature alone was a huge change from the past. Jobs wanted that interface, and what Jobs wants, Jobs gets.

All the same, the Macintosh 512K was an all-in-one package with a monochrome monitor, able to display only black and white, with no colors and no grays except the simulated gray achieved by alternating black and white pixels. But then, it was only 1984, and who needed color, anyway?