CHAPTER 2 - A Brief History of the Computer

computer - one that computes; specifically: a programmable electronic device that can store, retrieve, and process data

What is this human propensity for technology all about? As a species, we seem to be in constant pursuit of ways to make our work easier, faster and more efficient. To what purpose? The ends hardly seem to matter; the means always turn out to be processing information and transmitting it to others of our species, whether to our peers or to future generations.

Dictionaries published prior to World War II defined a "computer" as one who computes: a person who calculates numbers. Using that definition, a history of computers begins with the abacus, first used by the Babylonians around 500 B.C.E. to help with simple arithmetic. After World War II, a critical growth period in the development of computers, the definition evolved to allow that a computer could be a machine rather than a person. According to the 2000 Merriam-Webster Collegiate Dictionary, a computer is a programmable electronic device that can store, retrieve, and process data. Today we perceive a computer as a machine that assists us in processing information: a "computer" began as a person who calculates, but today we define it as a programmable electronic device.

Jacquard Loom

The Jacquard Loom is one of the world's first mechanical computers, which also makes it the first machine to produce computer graphics. Jacquard's 200-year-old loom met most of the criteria for a modern computer, with the exception of how it derived its power. Looms had been around since the 1400s or thereabouts, around the same time that Gutenberg was adapting the wine press to reproduce printed sheets. The loom had been in use for about four centuries when, in 1801, the silk weaver Joseph Marie Jacquard sought to improve the complex task of keeping the threads in their proper positions during the intricate weaving process.

Jacquard solved the problem by encoding the machine instructions and the data required to weave the patterns on program cards, which had been in use since the previous century. Jacquard's loom wove with precision, speed, strength and dexterity that went well beyond the skills of a human silk weaver. Using the programmable loom, weavers were able to create more intricate patterns than ever before, and to duplicate their work with digital accuracy.

While the loom was not "electronic," it was automated and relied on punched program cards to control the positions of the threads, enabling it to weave accurate and intricate patterns at "high speed." The program cards contained the information being fed into the machine; the newly woven fabric was the output. Not everyone considers the loom a computer, but look at the wonderful "grid of pixels" that is woven into the cloth. The very appearance of the "output" suggests to us today that the design in the fabric must have been implemented with the help of a computer. No unassisted person would or could create so intricate a pattern "by hand," and even if they could, it would take too long and be cost-prohibitive. The "output" of a loom, a piece of woven fabric, doesn't necessarily make the loom a computer, but it is not difficult to draw a digital analogy between a piece of woven fabric and the grid of pixels in a raster image. As far as I am concerned the loom was a computer, and Jacquard silks were the first computer graphics. The program cards carry binary messages, encoded and fed into the machine, where they are translated into instructions for the various parts of the loom, which serves as the output device. Congrats, Mr. Jacquard.
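To make the fabric-as-raster analogy concrete, here is a small sketch that treats each punched card as a row of binary holes and renders the stack of cards the way a raster display renders a grid of pixels. The card patterns are invented purely for illustration; this is an analogy, not a model of any real Jacquard card set.

```python
# A toy analogy: punched-card rows as rows of pixels in a raster grid.
# Each string is one "card": '1' = hole (thread lifted), '0' = no hole.
# The pattern below is purely illustrative, not a real Jacquard card set.
cards = [
    "00111100",
    "01000010",
    "10100101",
    "10000001",
    "10100101",
    "10011001",
    "01000010",
    "00111100",
]

def weave(cards):
    """Render the card stack as a grid, one woven row per card."""
    for card in cards:
        # A '1' becomes a dark "thread crossing," a '0' stays background,
        # just as a raster image maps bits to on/off pixels.
        print("".join("#" if bit == "1" else "." for bit in card))

weave(cards)
```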

Charles Babbage's Analytical Engine

Charles Babbage, a prominent British mathematician and leading figure in London society, knew Jacquard was onto something big when he had the opportunity to visit a silk-weaving plant that used the Jacquard loom. Babbage was self-motivated; as a young boy he read every algebra book he could get his hands on, waking regularly at 3 a.m. to study. Babbage attended Trinity College, Cambridge, but even such a prestigious school didn't have much to offer the brilliant young man, for its materials and methods in mathematics were antiquated, relying solely on Newtonian mathematics. Nearly all of the recent advances in mathematics had been made, instead, by those who followed the calculus techniques of Gottfried Leibniz.

Trinity College, Sir Isaac Newton's alma mater, was still committed to Newton's notation for the calculus, leaving Babbage so frustrated that he decided to take matters into his own hands. While in school, Babbage formed the Analytical Society, where he and his fellow students translated part of an up-to-date French mathematics text by Lacroix and published several other books on the calculus of differentials.

After graduating, the twenty-year-old Babbage worked on calculating a set of tables to be used by ocean-navigating vessels to determine the positions of the stars. His passion for precision left him frustrated with the inevitable errors made by the clerks who worked on the calculations. "I wish to God these calculations could be done by a steam engine," Babbage complained. Not one to make idle comments, Babbage built a model of a calculating machine, beginning a lifelong quest to automate solutions to general mathematical problems using a steam-driven calculating machine.

Babbage called the machine a Difference Engine; it was designed to compute mathematical tables. He eventually set the Difference Engine aside to pursue his dream of the Analytical Engine, a kind of calculator controlled by punched cards, among many other technical features, securing Babbage's place in the history of computing. The plan for the Analytical Engine called for more than 50,000 moving parts, powered by a steam engine the size of a locomotive. Building a working Analytical Engine became Babbage's life's work.

In 1814, the well-to-do Babbage married Georgiana Whitmore, whose family were Shropshire landowners and politicians. Both Charles and Georgiana came from wealthy families, which enabled Babbage to pursue his professional interests independently. He received some money from the English government, but its support of the Difference Engine was less than whole-hearted, and Babbage never received enough to cover manufacturing and employee expenses. Work on the Difference Engine eventually fizzled out in 1834, but the project remains important as the precursor to the real computer, the Analytical Engine.

Charles' wife, Georgiana, died in the summer of 1827, and about a year later Charles moved his family and workshop into bigger and better digs. This was no modest home, accommodating over 200 English and European socialites and dignitaries, who were frequent guests at Babbage's Saturday evening soirées. It was at one of these parties, in 1833, that Babbage made the acquaintance of Augusta Ada Byron, also known as Ada Lovelace, the daughter of the poet Lord Byron.

Lovelace, a writer and amateur mathematician, collaborated with Babbage, chronicling his progress over the years. She corresponded with him regularly, writing up her own ideas and sending them to him.

Moreover, Lovelace wrote professional papers on programming techniques. It wasn't unusual for her to raise money by making public presentations. Presenting to British nobility, Lovelace attracted potential patrons and then used their money to help Babbage continue construction on his Analytical Engine.

Due to the extent of Lovelace's contributions to the Analytical Engine, some regard her as the world's first software engineer, while others argue the honor goes to Babbage himself. Some believe Lovelace's role was more like that of Babbage's intern: they assert that Babbage taught and mentored her in mathematics and that, in turn, Lovelace helped to promote and explain his theories to the public.1

We do know that during Lovelace's collaborations with Babbage, she accumulated over 7,000 pages of mathematical notes and diagrams, interpreting the new technology and presenting it to the British government in book form. Their mission was to help those members of government, society and finance understand the potential uses and impact of the Analytical Engine.

In Ada, the Enchantress of Numbers, Betty Alexandra Toole comments in her notes about Babbage's Analytical Engine: "The distinctive characteristic of the Analytical Engine … is the introduction of the principle which Jacquard devised for regulating, by means of punched cards, the most complicated patterns in the fabrication of brocaded stuffs." Ada compared Jacquard's loom and Babbage's Analytical Engine, saying, "We may say most aptly that the Analytical Engine weaves algebraical patterns just as the Jacquard-loom weaves flowers and leaves."2

All was not well, however, with the lovely countess, who grew up with her mother after her father, Lord Byron, left his wife and infant daughter and departed England for good. Lovelace was fond of the racetrack and developed a gambling addiction, living most of her adult life in debt. She also had a drug problem, and she died of uterine cancer in 1852 at the age of 36. She was outlived by Babbage, who died in 1871 without ever raising enough money to complete a working steam-driven Analytical Engine.

Babbage's eventual completion of a successful working Analytical Engine was not to happen in his lifetime. Despite the heroic efforts of Ada Lovelace, Babbage was never able to convince the British government to adequately fund the development and completion of the Analytical Engine. Little did anyone realize at the time, but somewhere in the rank and file of the underclass, young George Boole was cooking up a whole new kind of Algebra, which we know as Boolean Algebra.

Charles Babbage made meticulous notes during his lifetime and was a prolific writer, recording and publishing dozens of papers and articles between 1814 and 1868, three years before his death. Among his writings were detailed plans, which survive today, for building mathematical Calculating Engines. In 1991, 120 years after Babbage's death, Doron Swade, senior curator of computing at the Science Museum in London, built a working engine from Babbage's plans using only materials and technology that would have been available to Babbage in the 1800s. Swade's reconstruction demonstrated that Babbage's designs would have succeeded had he not run out of money.

George Boole, Math and Technology

In 1832, binary algebra, now known as Boolean algebra and essential to the creation of the binary computer, was developed by seventeen-year-old George Boole, another young British mathematician. Unlike Babbage, Boole came from a poor family, and young George, who was brilliant in all areas of study, worked from an early age as a schoolteacher to support his family. Boole was primarily self-taught, attending public schools but pursuing most of his studies independently. He eventually became a member of the Royal Society, and he published papers and books on mathematics. Many of Boole's ideas were largely ignored until generations after his death.
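As an illustration of what that algebra looks like on the two values a binary machine can represent, the short sketch below states AND, OR and NOT arithmetically and checks a couple of the identities that make the system work. The particular encoding is my own; Boole's notation differed.

```python
# Boolean algebra on the two values 0 and 1: AND behaves like multiplication,
# OR like addition capped at 1, and NOT like subtraction from 1.
# These identities are what let binary hardware "do algebra" with switches.
def AND(x, y): return x * y
def OR(x, y):  return x + y - x * y
def NOT(x):    return 1 - x

for x in (0, 1):
    for y in (0, 1):
        # De Morgan's law: NOT(x AND y) == (NOT x) OR (NOT y)
        assert NOT(AND(x, y)) == OR(NOT(x), NOT(y))
        # Absorption law: x AND (x OR y) == x
        assert AND(x, OR(x, y)) == x

print("Boolean identities hold for all inputs in {0, 1}")
```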

Turn of the Century Developments

Herman Hollerith was an American inventor who devised a machine to read punched cards and make calculations represented by holes in the cards. Though not really a computer, the Hollerith Tabulator picked up on the punched-card technology and was first used in 1890 to sort statistical information for the United States census, cutting the time necessary to tabulate the data from a whopping 7½ years to just six weeks. In 1896, Hollerith formed the Tabulating Machine Company, which, after a number of mergers, was eventually renamed the International Business Machines Corporation (IBM) in 1924. In those days, IBM's main claim to fame was its punched cards, and it remained a leader in the punched-card industry until the cards became obsolete in the late '60s and early '70s.

The first computers were purely mechanical devices, receiving no assistance from electricity. It wasn't until 1906 that the "electronic tube" (electronic valve) was developed by the American Lee DeForest, paving the way for electronic digital computers.

In 1938, the German engineer Konrad Zuse used Boolean algebra to develop a prototype mechanical binary programmable calculator. It used old 35mm film for its punched-tape programs, a numeric keyboard to input data, and electric lamps to display the output. Zuse was assisted by Helmut Schreyer, who in 1940 succeeded in constructing a 10-bit adder using vacuum tubes and neon lamps for "memory."
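To give a rough idea of what a 10-bit adder does, here is a sketch that assembles one from Boolean gate operations. It is only an illustration of the principle; it makes no attempt to model Schreyer's actual vacuum-tube circuitry, and the helper names are my own.

```python
# A 10-bit ripple-carry adder built from Boolean gates, illustrating the
# kind of job Schreyer's vacuum-tube adder performed. Purely illustrative.
def full_adder(a, b, carry_in):
    """Add three bits; return (sum_bit, carry_out) using AND/OR/XOR logic."""
    s = a ^ b ^ carry_in
    carry_out = (a & b) | (carry_in & (a ^ b))
    return s, carry_out

def add_10bit(x, y):
    """Add two integers modulo 2**10, bit by bit, least significant first."""
    carry = 0
    result = 0
    for i in range(10):
        bit_x = (x >> i) & 1
        bit_y = (y >> i) & 1
        s, carry = full_adder(bit_x, bit_y, carry)
        result |= s << i
    return result  # any final carry out of bit 9 is dropped (overflow)

print(add_10bit(300, 200))   # 500
print(add_10bit(1000, 100))  # 76, because 1100 overflows 10 bits (1100 - 1024)
```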

Alan Turing and the Nazi Enigma

During World War II, British mathematician Alan Turing worked for the British government and was a key player in breaking the code of the German U-boat Enigma cipher. The larger codebreaking effort was also aided by the first programmable electronic computer, Colossus, which contained some 2,400 vacuum tubes and was built in December 1943 by Dr. Thomas Flowers at the Post Office Research Laboratories in London.

As early as the 1930s, Turing envisioned a general-purpose computer, which he called the Universal Turing Machine, that used specially encoded instructions to perform specific tasks. Turing was a brilliant mathematician and visionary, predicting, "One day ladies will take their computers for walks in the park and tell each other, 'My little computer said such a funny thing this morning!'"

Turing was a pioneer in the field of artificial intelligence. Among his numerous accomplishments was a test he developed to determine at what point a computer may be said to be thinking, and hence to be intelligent. The test is known as "the Turing Test."

The Turing Test presents the following scenario: place an interrogator in a closed room, and a computer and a human in a second room. The interrogator poses questions to both, and if the interrogator cannot tell from the answers which is human and which is machine, then the machine can be said to be "thinking."

Alan Turing was only 33 when World War II ended. He continued his work and study in mathematics and computers with pioneering research in artificial intelligence, proposing that a computer could learn from, and hence modify, its own instructions. Turing's life's work was interrupted by an unfortunate turn of events that began when his home was burglarized in 1951. Turing was openly homosexual at a time when homosexuality was a felony in Britain. During the course of the burglary investigation, he admitted to having an affair with a man, suggesting that his partner was probably acquainted with the suspected culprit. Turing was summarily arrested, tried and convicted of "gross indecency." He refused to renounce his homosexuality and was subjected to forcible injections of female hormones in lieu of serving prison time. Publicly, he made light of the fact that he was growing breasts, but privately Turing was tortured, and he committed suicide in 1954.

Mauchly, Eckert, Von Neumann and the ENIAC

The fact that all the early developments in math and computers took place in England and Europe shouldn't be too surprising. America was still a young country, and from its beginning it has been involved in one war or another. While Alan Turing was working with the British government breaking Nazi codes, John Mauchly and J. Presper Eckert of the Moore School of Engineering in Philadelphia, Pennsylvania, designed and built the Electronic Numerical Integrator and Computer, the first all-electronic digital computer, completed in 1946 after the war had ended. They called their vacuum-tubed brainchild the ENIAC, and its success was due largely to mathematician John Von Neumann's logical program instructions, which enabled the technology to work. After the war, Von Neumann modified the instructions, transforming the ENIAC into a programmable computer.

The 1946 ENIAC was the Rolls Royce of computers, running 1,000 times faster than its competition. It could work with 20 numbers of 10 decimal digits each, an amazing feat for its time, but even that relatively minuscule capacity required 20,000 switching units. The ENIAC weighed in at 30 tons and dissipated a whopping 150,000 watts of power.3

Although it is widely held that the ENIAC was the first electronic digital computer, a 1973 United States district court ruling stated that the Iowan John Vincent Atanasoff invented the electronic digital computer. The court's decision was based on the original work Atanasoff did in the 1930s and the influence his work later had on John Mauchly's design of the ENIAC. … Atanasoff was the last of the lone inventors in the field of computation; after him, such projects were too complicated for anything less than a team effort.4

The first American programmable computer, very much like Babbage's Analytical Engine in concept, became a reality in 1943, when Howard Aiken of Harvard University, working with IBM, completed the Mark I. Unlike Swade's reconstruction of Babbage's engine, the Mark I was not steam-driven; this calculating machine was electromechanical and a direct descendant of Babbage's Analytical Engine. The Mark I was 51 feet long, weighed 5 tons, and incorporated 750,000 parts. The American-made Mark I preceded the British Colossus by just months.

Post WW II: making connections in the MIT years

World War II was the catalyst that brought together great minds from a variety of disciplines. Instead of any one individual being responsible for new developments in the computer "industry," teams collaborated to advance computer hardware and software.

Claude Shannon, a professor at MIT (the Massachusetts Institute of Technology), was instrumental in establishing the mathematical foundation for information theory, a collection of theorems about information and communication that established information as a cosmic fundamental, along with energy and matter. Shannon's work was an integral link in the development of artificial intelligence and today's personal computers.5

Also from MIT, professor of mathematics Norbert Wiener and his assistant Julian Bigelow discovered the importance of feedback loops during their research into creating an automatic aiming mechanism for anti-aircraft artillery. The two men saw an important relationship between what they were attempting to accomplish and the way human beings react to changing events, so they sought first to understand both the thought process and the mechanics involved in the decision to act and in the action itself.

Take baseball, for example. What exactly occurs when the pitcher fires a 90-mile-per-hour fastball? The batter visually calculates the trajectory and speed of the ball, anticipating just the right time and angle to swing the bat. Learn how a person does that, and you can then convert that knowledge into its mathematical equivalent. Wiener and Bigelow thought that perhaps someone in the field of neurophysiology would be able to help them solve their problem. Wiener's work led to the eventual development of the field of cybernetics, which brings together such seemingly diverse disciplines as logic, statistics, communication engineering, neurophysiology and mathematics. Scientists began studying the way the human brain works and used their findings to assist in the mechanization process.
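What Wiener and Bigelow were after can be sketched as a feedback loop in a few lines of code. The sketch below is only an illustration of the idea, not their actual predictor; the gain value and the target's path are invented for the example.

```python
# A minimal feedback loop, in the spirit of the anti-aircraft aiming problem:
# repeatedly measure the error between where we are pointing and where the
# target is, and correct a fraction of that error on every cycle.
def track(target_positions, gain=0.5):
    aim = 0.0
    for target in target_positions:
        error = target - aim        # feedback: compare the output with the goal
        aim += gain * error         # correction proportional to the error
        print(f"target={target:6.2f}  aim={aim:6.2f}  error={error:6.2f}")

# A target drifting steadily to the right; the aim converges toward it.
track([10.0, 12.0, 14.0, 16.0, 18.0])
```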

In the 1950s, MIT's Claude Shannon turned his talents and skills to building an intelligence; together with his assistants Marvin Minsky and John McCarthy, the trio engaged in the study of what was to become the field of Artificial Intelligence.

Another MIT researcher and professor, Dr. J.C.R. Licklider, was undoubtedly one of the most prescient and influential minds in the annals of computing. Upon meeting you, he would ask that you please call him "Lick." His specialty was psychoacoustics, and he became consumed with the study of computers in much the same manner as most of his predecessors and contemporaries: he, like almost everyone who was important in the history of the computer, was attempting to perform calculations and complex work, and searched for ways to make that work quicker, easier, and error-free. Lick was a meticulous record-keeper, tracking the time he spent on each and every task throughout the day during the course of his psychoacoustics research. Because he was building complex mathematical and electronic models from the information he collected, he sought out some sort of mechanical servant to perform calculations and clerical duties.

During 1957-1958, Licklider joined Bolt, Beranek & Newman (BB&N), a consulting firm that owned a PDP-1, the first machine made by the Digital Equipment Corporation. Lick's colleagues arranged for him to work with the PDP-1 directly. Typically, during the '50s, if you wanted to run a computer program you had to hand the program and data off to an operator and wait several hours, or even days, for the results. The PDP-1 was one of the first of the "minicomputers," and its advantage was that it allowed the "user" to interact with it directly, modifying the paper-tape input while the program was running; it was the first interactive computer. Describing how it felt to work with an interactive computer, Licklider said, "I guess you could say I had a kind of religious conversion." That conversion led him to pursue the concept of "interactive" computing, and he proposed a "man-computer symbiosis":

The hope is that, in not too many years, human brains and computing machines will be coupled together very tightly, and that the resulting partnership will think as no human brain has ever thought and process data in a way not approached by the information-handling machines we know today.6

In October of 1962, Licklider became the director of the Information Processing Techniques Office at the Pentagon. He had been successful in convincing members of the government that the future of computing would be interactive, with human operators inputting data via a keyboard and with a display-screen interface, key factors signifying the beginning of the age of personal computing. While working with ARPA (the Advanced Research Projects Agency), one of Licklider's goals was to allow many programmers to use a computer simultaneously, a practice that came to be known as time-sharing.

It was the early '60s, America was pumped up, and the space race was on. Integrated circuits would replace transistors, resulting in smaller computers. In his 1965 book, Libraries of the Future, Licklider revealed what has come to be known as the "rule of two": continuing miniaturization of its most important components means that the cost effectiveness of computer hardware doubles every two years.

According to Robert Taylor, director of the Systems Research Center at Digital in Palo Alto, CA., "Lick's vision provided an extremely fruitful, long-term direction for computing research. He guided the initial research funding that was necessary to fulfill the early promises of the vision. And he laid the foundation for graduate education in the newly created field of computer science. All users of interactive computing and every company that employs computer people owe him a great debt."7

In his new position at the Pentagon, Licklider had access to abundant funding and was responsible for providing support to research groups nationwide, in academia and in the private sector alike. One of those groups, from MIT, was dubbed Project MAC (the Multi-Access Computing Project). Project MAC consisted of the MIT Artificial Intelligence (AI) establishment of McCarthy, Minsky, Papert, Fredkin, and Weizenbaum.

Often, the way it worked back then was that Licklider's research money would fund a project directed by, for example, Marvin Minsky. The job of Minsky's team of programmers was to seek out and expose the weaknesses in the projects that others in Project MAC were working on; in effect, Minsky's group was getting paid to write code that would crash the system. This "building up, shooting down" process was necessary in order to write reliable software, but the result was often adversarial. The hackers took a sort of perverse comedic pleasure in doing their jobs, and the resulting work environment showed it: they were the hot-shot, snot-nosed kids getting their kicks out of crashing the good ol' boys' systems. So it should be understandable that, at times, men who were colleagues found themselves in an adversarial relationship.

The people who were attracted to the core research groups funded by Licklider tended to be bright young men: anti-establishment, rebellious, and a little wild. They were much like any fanatic who can go for days on end focused on a specific task. Not too long ago we called that a "Type A" personality. Sometimes I think of it as the opposite of attention deficit disorder, where one can stay focused on a task until it's done, without resting or stopping for proper breaks, meals and balance of life. Called "hackers," the programmers were often long-haired, unkempt workaholics, going for days with little sleep, living on caffeine and vending-machine food. When the hackers weren't trying to crash systems, they were usually involved in some sort of programming one-upmanship, continually pushing the envelope. One student proved to his advisor, Marvin Minsky, that he could create a chess program; he followed through and surprised everyone, including Minsky, when the program won a resounding victory over the biggest critic of the AI field, Hubert Dreyfus. The granddaddy of all computer games was Spacewar, which quickly spread to campus computer centers nationwide. Many of the MIT hackers would later become the software designers of the 1980s. A few years later, in the early '70s, another breed of hackers "hacked into" the phone system to get free access to long-distance lines. These phone hackers were called "phone phreaks," and their ranks included the likes of Steve Wozniak, co-founder of Apple Computer, Inc.

It's important to understand that in the early '60s, the people who used computers were mostly engineers, mathematicians, systems analysts and computer programmers. If you weren't working in the computer systems department of a large company, it would be very difficult to gain access to a computer at all, though you might be able to buy time on a shared mainframe set up to accommodate multiple users. That practice, known as time-sharing, was another Licklider brainchild. Renting computer time was expensive, and the idea was to spend as little time on the mainframe as possible. Computers were nothing like the personal computers we have today: you never knew the results of your job until the printout was delivered. In those days computers were not yet interactive, but that certainly was the goal of Licklider's research teams.

At the first official meeting on interactive graphics, Ivan Sutherland, a protégé of Claude Shannon, the father of information theory, was invited as a professional courtesy. While at the meeting, Sutherland surprised the group when he presented a program he had created for his doctoral thesis. The program was called Sketchpad, and it provided an innovative and ambitious new way of commanding the operations of the computer.8

Sketchpad allowed a computer operator to use the computer to create, very rapidly, sophisticated visual models on a display screen that resembled a television set. The visual patterns could be stored in the computer's memory like any other data, and could be manipulated by the computer's processor. … But Sketchpad was much more than a tool for computers to translate abstractions into perceptually concrete forms. And it was a model for totally new ways of operating computers; by changing something on the display screen, it was possible, via Sketchpad, to change something in the computer's memory.

Sketchpad was well beyond the scope of work of anyone on the professional research team.

Meanwhile, Doug Engelbart, a former naval radar technician, had his own ideas on how to transform the computer into a "mind amplifier."9

Although many of the details took decades to work out, the main elements of what he wanted to achieve came to him all at once: "When I first heard about computers, I understood, from my radar experience, that if these machines can show you information on punchcards and printouts on paper, they could write or draw that information on a screen. When I saw the connection between a cathode-ray screen, an information processor, and a medium for representing symbols to a person, it all tumbled together in about half an hour."

Engelbart first pitched his idea of an interactive computer with a graphical interface to Hewlett-Packard, a manufacturer of electronic instruments which, in the 1960s, had no plans to branch out into computers. Perhaps that turn of events was fortunate for us, because making his ideas reality would take lots of money. Engelbart continued to search for support of his vision, and the person who was able not only to share that vision but to provide the proper funding and research environment as well was J.C.R. Licklider at ARPA. With his abundance of research money, Licklider provided the funding necessary to form the Augmentation Research Center (ARC), where Engelbart and his colleagues would develop and test their ideas.

The result of Engelbart's vision was the "mouse" and a "windows" interface, the true ancestor of today's personal computer interface, which has changed the way human beings structure and manipulate information. According to Howard Rheingold, "It is almost shocking to realize that in 1968 it was a novel experience to see someone use a computer to put words on a screen, and in this era of widespread word processing, it is hard to imagine today that very few people were able to see in Doug's demonstration the vanguard of an industry." Human beings are designing the technology to augment and enhance the human mind, and as a result, each new generation approaches and thinks about information and the world around us in a slightly different way than the preceding generation. Engelbart's vision was driven by the desire to work collaboratively, augmenting the human mind by sharing information and knowledge, creating computers that give visual feedback and allow the user to structure information in ways that make new connections and share the information with others.

Today

The computer in the year 2000 is a product of the evolution of human ideas and our unique ability to implement them. The computer enables us to think faster and work faster, to think more efficiently and work more efficiently, and to express creativity and communication in an ever-changing manner.

The human species has always used technology to assist in communication. In the case of computer graphics, consider that just 30 short years ago I was printing out x's that barely formed the likeness of the Mona Lisa or Abraham Lincoln. There were no TV boxes with pioneering computer games like Pong, and Pac-Man didn't pervade our culture until the late 1970s and early 1980s. Amazing as it seems, in 1980 we could barely get computers to spit out mere letters and numbers on crude dot-matrix printers. Few individuals dreamt of the practical applications that would result from printing crude graphics onto paper.

Today, individuals own their own computers, color printers, digital cameras and scanners. Most people take computers, computer graphics and the internet for granted, and many can't remember a time without them.

What's next? It seems like these days all you have to do is imagine it; then it's just a short leap until someone actually turns that fantasy into reality. Just last night I read with interest about a new advertising technology that lets you hold a magazine page up to a monitor outfitted with a tiny, inexpensive video camera and software. In seconds your computer seeks out a specific address on the internet where you can link to the advertiser's page.

Not so amazing, I guess, but how about this? Predictions are that in 15 years or so, silicon will be obsolete. The first computers, like the ENIAC, used vacuum tubes as a means of transferring electrical signals; next came transistors, then integrated circuits. Silicon is a minuscule crystal that carries an electrical signal, but minuscule is a relative term when compared to the size of a molecule.

Prepare for molecular electronics, or moletronics, and a time when nothing will ever be the same again. The chemist Jim Tour of Rice University and the physicist Mark Reed of Yale create molecules. Their newest molecules "exhibit semiconductive properties that give them the ability to hold a charge or behave like switches or memory, meaning molecular electronics could replace the transistors, diodes, and conductors of conventional microelectronic circuitry."10

Molecular microchips, populated with transistors that can be produced cheaply in astronomical numbers, will compute faster, remember longer without needing to be refreshed, and consume power at a mere trickle. More immediately, moletronics will transcend the limitations of magnetic and optical storage technologies, providing memory systems so powerful, small, and inexpensive that the entire Internet could be cached on a single desktop. Or, as Tour's associate Thomas Mallouk puts it, "Imagine a computer that remembers every keystroke you've ever made, with more storage capacity than you could ever need."

The current research and development in the new field of moletronics makes all the more visionary J.C.R. Licklider's mid-60s remarks that "Nature is very much more hospitable to information processing than anybody had any idea about in the 1950s. We didn't realize that molecular biologists had provided an existence proof for a fantastically efficient, reliable, information processing mechanism: the molecular coding of the human genetic system. The informational equivalent of the world's entire fund of knowledge can be stored in less than a cubic centimeter of DNA, which tells us that we haven't begun to approach the physical limits of information processing technology."11

What might we do with a molecular computer? What if we could get the molecular computer to bind to DNA, and then add a chemical to provide a link to specific areas of the brain? For humans to be able to communicate digitally and actually make the computer a part of ourselves … well, the possibilities are endless, and the ethical and social ramifications are food for thought.

If that isn't wild enough for you, it is likely that most of us will witness quantum computers in our lifetimes. Quantum computers will be powered by "intelligent" living energy called nanobots, which for the sake of economy, will be self-replicating.12

I hear the music from The Twilight Zone playing in my head. There are as many theories about the future of humanity as there are futures of humanity. My own personal projection is that our computers of the future will make a direct connection to the sensory and information areas of the human brain. Those computers will cast a 360-degree projection of a holographic image that only we will be able to view. We will have the ability to download our vision to selected participants, perhaps a specific audience, a community or "for your eyes only."

Rheingold says ideas are like seeds, or viruses. "If they are in the air at the right time, they will infect exactly those people who are most susceptible to putting their lives in the idea's service." We are in the throes of the information age, a time when communication with our fellow human beings is at an all-time peak. Our ability to connect with people scattered across the globe grows and multiplies exponentially, allowing us to form new sub-cultures based on things other than geographical location. Each day it becomes easier to form relationships based on common interests because there are fewer and fewer boundaries … all this comes from an effort to improve communication by the transmission of information.

1 Ada, the Enchantress of Numbers, Strawberry Press, 1992

2 Scientific Memoirs, Selections from The Transactions of Foreign Academies and Learned Societies and from Foreign Journals Edited by Richard Taylor, F.S.A., Vol III London: 1843, Article XXIX.

3 Fire in the Valley, The Making of the Personal Computer, Second Edition, Paul Freiberger & Michael Swaine, McGraw-Hill, p.9 2000

4 Tools for Thought, Howard Rheingold, p.78 The MIT Press, 2000

5 Ibid.

6 "Man-Computer Symbiosis," JCR Licklider, IRE Transaction on Human Factors in Electronics, vol. HFE-1, pages 4-11, March 1960.

7 Preface to a white paper issued by Digital Systems Research Center, Robert W. Taylor, August 7, 1990, "In Memoriam: J.C.R. Licklider, 1915-1990."

8 Tools for Thought, Howard Rheingold, p. 149 The MIT Press, 2000

9 Ibid.

10 Ibid.

11 "Wired," Vol. 8, Issue 7, July 2000, p. 244

12 Ray Kurzweil, The Age of Spiritual Machines, Penguin Books, 1999

"Ada Lovelace, George Boole, John von Neumann, Alan Turing and Presper Eckert were all in their early twenties or younger when they did their most important work." Howard Rheingold, Tools for Thought

Moore's Law

Transistor die sizes are cut in half every twenty-four months; therefore both computing capacity (i.e., the number of transistors on a chip) and the speed of each transistor double every 24 months. This is the fifth paradigm since the inception of computation, after mechanical, electromechanical (i.e., relay-based), vacuum tube, and discrete transistor technology, to provide accelerating returns to computation. "Computers are about one hundred million times more powerful for the same unit cost than they were a half century ago. If the automobile industry had made as much progress in the past fifty years, a car today would cost a hundredth of a cent and go faster than the speed of light." Ray Kurzweil, The Age of Spiritual Machines
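As a back-of-the-envelope check of that claim (my arithmetic, not Kurzweil's), a quantity that doubles every two years compounds like this over half a century:

```python
# Compounding for a "doubles every 24 months" trend over fifty years.
years = 50
doubling_period = 2                 # years per doubling, per the text above
doublings = years / doubling_period
factor = 2 ** doublings
print(f"{doublings:.0f} doublings -> improvement factor of about {factor:,.0f}")
# 25 doublings -> roughly 34 million; a slightly faster doubling pace
# (about every 22-23 months) yields Kurzweil's "one hundred million times."
```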

What is a TURING MACHINE?

In 1936, Alan Turing, a British mathematician, came up with an idea for an imaginary machine that could carry out all kinds of computations on numbers and symbols. He believed that if you could write down a set of rules describing your computation, his machine could faithfully carry it out. Turing's Machine is the cornerstone of the modern theory of computation and computability, even though it was invented nine years before the creation of the first electronic digital computer. The Turing Machine consists of an Input/Output Tape, the machine itself, and a Rule List.
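Here is a minimal sketch of those three parts in code: a tape, a rule list, and a machine that follows the rules. The rule format and the example machine (which flips every bit it reads and halts at the first blank) are my own illustration, not Turing's original notation.

```python
# A tiny Turing machine: tape + rule list + a read/write head with a state.
# Rules map (state, symbol read) -> (symbol to write, head move, next state).
def run_turing_machine(tape, rules, state="start", blank=" "):
    cells = dict(enumerate(tape))   # the Input/Output Tape, indexed by position
    head = 0
    while state != "halt":
        symbol = cells.get(head, blank)
        write, move, state = rules[(state, symbol)]   # consult the Rule List
        cells[head] = write
        head += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells))

# Example rule list: scan right, flipping 0s and 1s, halt on the first blank.
rules = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", " "): (" ", "R", "halt"),
}

print(run_turing_machine("10110", rules))   # prints 01001 plus a trailing blank
```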