Don's Pages and my Music

Thursday, October 14, 2010

Digital Research - History of Computing

Digital Research

HISTORY

How the Computer Began

People Learn to Count

The first human being to record numbers in a storage medium may have been a Sumerian accountant somewhere in the lower Mesopotamian river valley about 3200 B.C., using the sexagesimal (base-60) numbering system, which was built up from alternating groupings of 10 and 6. The discovery of arithmetic brought the Sumerians tangible benefits, including the ability to numerically model the products of their economy, and their commerce grew, making Mesopotamia the cradle of Western civilization.

The earliest known computing instrument is the ABACUS, invented by the Chinese and in use for over 2,000 years. It consists of a frame in which parallel wires strung with beads are mounted. A user moves the beads along the wires to perform all primary arithmetical functions according to a set of rules that must be memorized and executed by the user.

The term COMPUTER was first used to refer to a human who made calculations.

People Teach Machines to Count

The first automatic COMPUTING MACHINE was created by Blaise Pascal in 1642. Designed only to add numbers for tax purposes, it was operated by moving dials that drove a mechanical calculating engine.

In 1671, Gottfried Wilhelm von Leibniz designed a machine capable of both addition and multiplication; multiplication was performed by repeatedly adding the multiplicand and shifting it one decimal place at a time. The stepped-gear mechanism that Leibniz introduced in 1694 remained in use in mechanical calculators through the 20th century.
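The shift-and-add idea is easy to see in modern terms. The short Python sketch below is an illustration only (the names are invented here, and it models the arithmetic rather than Leibniz's actual gearing): for each digit of the multiplier, the shifted multiplicand is added that many times.

    def multiply_shift_add(a, b):
        # Multiply a by b using only addition and decimal shifts,
        # the way Leibniz's stepped-gear machines worked in principle.
        result = 0
        shift = 0
        while b > 0:
            digit = b % 10               # current multiplier digit
            for _ in range(digit):       # add the shifted multiplicand 'digit' times
                result += a * (10 ** shift)
            b //= 10                     # move to the next multiplier digit
            shift += 1                   # shift the multiplicand one place left
        return result

    print(multiply_shift_add(127, 46))   # prints 5842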

In 1812, Charles Babbage, a mathematician at Cambridge, England (later its Lucasian Professor of Mathematics), advanced the idea of an automatic mechanical calculating device. Babbage realized that the human work of calculating even the most complex mathematical problems could often be broken down into a series of much smaller routines that a machine could potentially execute.

The Difference Engine

Charles Babbage reasoned that by automating the process of calculating these component routines, complex problems could be solved with more speed, accuracy, and less effort by a machine than by people performing the same work.

By 1822, after working on the concept for ten years, Babbage had designed an automatic mechanical calculating machine, which he called a "difference engine," and had built a small functioning model for demonstration.

After obtaining financial assistance from the British government, Babbage started construction of a full-scale difference engine in 1823.  It was designed to be powered by steam and fully "automatic" in both calculating and printing output tables.  It was controlled by a fixed instruction program that executed only in precise linear sequence.  This difference engine, although of limited practicality, was a great conceptual advance.

The Analytical Engine

Babbage continued work on his difference engine for a full ten years, but in 1833 he lost interest because he had a "better idea."

His new idea was the construction of a general-purpose, fully programmable automatic mechanical computer. Babbage called his machine an "analytical engine."  His design specified a parallel decimal computer operating on numbers (words) of 50 decimal digits, with a storage capacity (memory) of 1,000 such numbers.

The analytical engine was to use punched cards, and was to operate "automatically" by steam power.  It would require only one attendant and it would perform arithmetic.  Babbage's analytical engine was never completed, but it was the first real model of the modern computer. However, the dream of a general purpose programmable computer then vanished for nearly 100 years.

Mechanical Calculators

After Babbage there was a loss of interest in "automatic counting computers" both commercially and academically, but interest revived in the machine prototypes that had been built by Leibniz and Pascal in the 1600s.  Even while Babbage was working on his "difference engine" during the 1820s, the first commercial products were being built using the Leibniz and Pascal concepts.

In 1820, Charles Xavier Thomas (Thomas de Colmar) developed the first commercially successful mechanical desktop calculator.  It was capable of performing addition, subtraction, multiplication, and division.

Around 1848, George Boole, an English mathematician, conceived a basic set of postulates (rules) describing the true-false statements of logic and stated them in algebraic terms. This system, which became known as Boolean Algebra (sometimes called digital algebra), could manipulate binary (0/1) input and output variables.

Boolean Algebraic Logic

Boolean Algebra, which manipulates OR, AND, and NOT relationships, would ultimately enable the analysis and design of practical digital computer systems, although it was not until 1938 that Boole's discoveries became the foundation for modern computer logic circuits, through Claude Shannon's application of them to switching circuits.
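In modern programming terms, Boole's three relationships are simply the logical operators built into every language. The small Python sketch below (illustrative only) prints their complete truth tables for the two binary values.

    # Truth tables for NOT, AND, and OR over the two binary values.
    values = (False, True)

    for a in values:
        print(f"NOT {a} = {not a}")

    for a in values:
        for b in values:
            print(f"{a} AND {b} = {a and b}    {a} OR {b} = {a or b}")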

Electricity and the Telephone

In 1883, Thomas Edison observed electronic conduction in a vacuum (the "Edison effect"), a phenomenon that would later underpin the vacuum tube. In 1888, Heinrich Hertz demonstrated radio waves.  Electricity became the key to the coming revolution in electronics and communication.

The Westinghouse Electric and Manufacturing Company, founded by George Westinghouse, installed the first alternating-current (AC) electrical power system in 1886 and fought to establish it as the standard in place of the early direct-current (DC) system championed by Thomas Edison, whose companies later became part of General Electric (GE).  In 1896, Westinghouse and GE pooled their patents, and AC became the standard in America.

Alexander Graham Bell's two key patents on the telephone, taken out in 1876 and 1877, became the foundation of the Bell Telephone Company, organized in Massachusetts in 1877, which later grew into the American Telephone and Telegraph Company (AT&T).  The telegraph had been the first electrical telecommunications device.

Electrical Calculators

In 1890, various companies electrified the mechanical calculator and began to add storage features, the ability to manipulate stored results, and the capability of printing the results to paper.

The industrial revolution was in full gear during this period and its great achievements were built on the use of numbers.  Advances in practically all areas were bringing progress to commerce, transportation, manufacturing, and building.  These advances also brought complex new engineering challenges.

Humans performed these complex calculations too slowly, and the results of human calculations were often inaccurate.  Engineering and accounting needs created the desire for a machine that could rapidly perform many repetitive calculations with absolute accuracy.  Some calculations, such as determining the center of gravity in huge structures, took months for humans to perform.

Tabulating Machines

Beginning with the census of 1890, Herman Hollerith and, later, James Powers, working for the U.S. Census Bureau, developed counting machines that could read information from stacks of paper cards punched with holes and then tabulate results from this information without further human intervention.

Provided the cards had been punched correctly by the human operator on the punched card machine, information could be read with great accuracy into the counting machine.  This was especially important to the U. S. Census Bureau which was undertaking the largest tabulating project ever done at the time.

The punched cards themselves were durable and were used as permanent memory to store information with virtually no quantity limit.  Different data could be stored on different batches of punched cards and fed into the calculating machine as desired to perform calculations to solve different problems.

The Punched Card Era

These electromechanical devices used electricity to power gears and wheels, and were commonly called "adding machines."  The most advanced models soon featured automatic reading of stacks of punched cards from reader stations.

The most sophisticated machines from IBM and others could perform such operations as addition, multiplication, and sorting, as well as feed out cards punched with results. By modern standards, the punched card machines were very slow, typically processing from 50 to 250 cards per minute, with each card holding up to 80 decimal digits.  However, they provided a method of input, output, and memory storage in a flexible way and on a massive scale.

For more than 50 years following the first use of punched card calculating machines, they continued to be used to do almost all of the world's business computing and most of the world's engineering and scientific computing.

Electronic Digital Computers

The Digital Era

During the 1930s, John V. Atanasoff, working as a professor of physics at Iowa State College, created a simple vacuum-tube device that took computing concepts well beyond the existing relay-switch devices.  In 1973, a U.S. federal court recognized Atanasoff's work as the prior invention of the electronic digital computer, invalidating the ENIAC patent.

Between 1935 and 1940, the German engineer Konrad Zuse, working in Berlin, was doing the most advanced research on using electric relays as ON/OFF controls acting as a BINARY (0 and 1) counting mechanism.  In 1941, he completed the Z3, the first working program-controlled computer to use the binary (DIGITAL) process, built from electrical relays; a proposed vacuum-tube successor was never completed.

Alan Turing first proposed his theoretical "Turing Machine" at Cambridge University in 1936.  Turing's concepts for an "intelligent" machine based on Boolean logic laid the foundation for what was to become modern computing.

Electrical Relay Computers

By the late 1930s, punched card machine techniques had become so well established and reliable that Howard Hathaway Aiken, in collaboration with engineers at IBM, undertook construction of a large automatic calculating device based on standard IBM electromechanical relays.  Aiken's machine, called the Harvard Mark I, handled 23-decimal-digit numbers (words) and could perform all four arithmetic operations.  Often considered the first practical digital computer, it became operational in 1944.

The Mark I was originally controlled from pre-punched paper tape with no provision for reversal, so automatic "transfer of control" instructions could not be programmed.  Output was by card punch and electric typewriter.  It used IBM rotating counter wheels and, for the first time, electromagnetic relays for its computations, which took 3 to 5 seconds for a multiplication.  It was among the first machines in the world to operate under programmed instructions.

The ENIAC Arrives

Starting in 1942, efforts in the U.S. centered on the work of J. Presper Eckert, John W. Mauchly, and their associates at the Moore School of Electrical Engineering at the University of Pennsylvania. They set out to build a "high-speed" electronic computer to meet the needs of the U.S. Armed Forces.  Eckert and Mauchly called their machine the "Electronic Numerical Integrator And Computer."  It became known simply as ENIAC.

The size of its numerical word was 10 decimal digits, and it could multiply two such numbers at the rate of 300 results per second, by finding the value of each product from a multiplication table stored in its memory. The ENIAC was about 1,000 times faster than the prior generation of relay computers.

Its primary application was to calculate ballistic tables for the military, but the urgency for this activity had decreased after WWII ended.

The ENIAC Computes

The ENIAC used about 18,000 standard vacuum tubes, occupied 167.3 sq m (1,800 sq ft) of floor space, and consumed about 180,000 watts of electrical power.  It had punched card input and output, and its arithmetic hardware comprised 1 multiplier, 1 combined divider-square rooter, and 20 accumulators built from decimal "ring counters" that served as adders.  It used fast-access (0.0002 second) read-write register memory.

The executable instructions making up a program were held in the separate units of ENIAC, which were plugged together to form a route through the machine for the flow of computations.

These connections had to be separately rerouted for each different problem, as did resetting function tables and switches.  Thus, the operation of ENIAC was controlled by a hardware "wire-your-own" instruction technique which was inconvenient and inflexible, yet it proved a machine could be programmable.

The First Electronic Digital Computer

The ENIAC made its public debut in 1946.  It is generally considered the first successful "high-speed" electronic digital computer (EDC).  Eckert and Mauchly formed their own company in Philadelphia in 1946, which became the Eckert-Mauchly Computer Corporation.  What was still lacking was a means to flexibly control the machine's operation.

In 1945, intrigued by the ENIAC computer, the mathematician John Von Neumann began an analytical study of computation that proved that a computer could have a simple, fixed physical structure and yet be able to execute any kind of computation using programmed control without making changes in hardware.

Von Neumann contributed important concepts on how fast practical computers could be organized and built.  He developed the stored-program technique and discovered that a complete computing program had to be divided into separate routines which could be flexibly used by the overall computational process.

Making the New Hardware Work

John Von Neumann realized that significant benefits and flexibility could be achieved by writing the program instructions to allow dynamic modification while the program was running.  This enabled hardware to act "intelligent."

Von Neumann met these two needs by providing a special type of machine instruction called conditional control transfer--which permitted the program sequence to be interrupted and reinitiated at any point--and by storing all instruction programs together with data in the same memory unit, so that instructions could be arithmetically modified in the same way as data.

As a result of these and several other techniques, computing and programming became faster, more flexible, and more efficient.  The new subroutine instructions performed far more computational work, and frequently used subroutines could be reused, avoiding reprogramming for each new problem.
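A toy interpreter makes the two ideas concrete: instructions and data share one memory, and a conditional jump (the "conditional control transfer") lets the instruction sequence be interrupted and resumed elsewhere.  The Python sketch below is purely illustrative; the instruction names and layout are invented here and do not represent von Neumann's design or any real machine.  It sums the integers 5 down to 1.

    # Program (locations 0-8) and data (locations 9-10) live in the SAME memory.
    memory = [
        ("LOAD", 9),     # 0: acc = memory[9]          (running total)
        ("ADD", 10),     # 1: acc = acc + memory[10]   (counter)
        ("STORE", 9),    # 2: memory[9] = acc
        ("LOAD", 10),    # 3: acc = counter
        ("SUBI", 1),     # 4: acc = acc - 1
        ("STORE", 10),   # 5: counter = acc
        ("JNZ", 0),      # 6: conditional control transfer: if acc != 0, jump to 0
        ("HALT", None),  # 7: stop
        ("NOP", None),   # 8: unused
        0,               # 9: data -- running total
        5,               # 10: data -- counter
    ]

    pc, acc = 0, 0                   # program counter and accumulator
    while True:
        op, arg = memory[pc]
        pc += 1
        if op == "LOAD":
            acc = memory[arg]
        elif op == "ADD":
            acc += memory[arg]
        elif op == "SUBI":
            acc -= arg
        elif op == "STORE":
            memory[arg] = acc
        elif op == "JNZ" and acc != 0:
            pc = arg
        elif op == "HALT":
            break

    print(memory[9])                 # prints 15 (5 + 4 + 3 + 2 + 1)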

Programming with Logic

Programs could be kept intact in "libraries" and read into memory as needed from a secondary storage place such as punched cards or tapes.  The all-purpose computer memory became the assembly location for various program parts.  The first generation of modern programmed electronic computers to take advantage of these improvements appeared in 1947.  They used the first Random Access Memory (RAM), generally consisting of approximately 8,192 Bytes.

In 1948, the first truly programmable stored-program digital computer ran its first program at the University of Manchester in the United Kingdom; Alan Turing, who joined Manchester that year, had defined the ideal of a "universal machine that can do any task that can be described in symbols."  Turing was among the first to realize that the computer could perfectly execute human LOGIC so long as that logic could be expressed to the computer in a language it could reliably interpret. Turing's concepts set the stage for the development of SOFTWARE that would embody human thinking.

The Computer Becomes Available

In 1947, Eckert-Mauchly's first major customer was the U.S. Census Bureau.  In 1948, Prudential Insurance became their first business customer, contracting for a machine to calculate actuarial (life expectancy) tables.  Eckert and Mauchly greatly underestimated the complexity, cost, and time required to produce a commercial product, and in early 1950 the company was sold to its leading competitor, Remington-Rand.  The ENIAC itself remained in service until 1955.

Remington-Rand's new electronic digital computing machine, the UNIVAC (UNIVersal Automatic Computer), was first delivered in 1951 and made its star-studded public debut in November 1952 on the CBS television network.

With Walter Cronkite anchoring the CBS 1952 Presidential Election Returns, on nationwide broadcast television, UNIVAC was used to predict who would win the election and become the next President of the United States.

The Computer Debuts on Television

CBS fed the incoming Presidential election returns into the UNIVAC, which was using one of the first computer databases.  Early in the evening, UNIVAC issued its computational prediction that Eisenhower would win in a landslide.  Conventional pundits had expected a much closer race against Adlai Stevenson and assumed the "computer made an error," so CBS withheld the prediction from the air; but as the night went on, Walter Cronkite announced that UNIVAC had been right and Eisenhower had won.

By 1950, these computers could process 1,000 instructions per second.  For the first time a "thinking" machine was much faster than any human operator.  The first-generation stored-program computers required considerable maintenance, attained perhaps 70% to 80% reliable operation, and were used for 8 to 12 years.  Typically, they were programmed directly in machine language, although by the mid 1950s progress had been made in several areas of advanced programming.  This first group of machines included EDSAC and UNIVAC.

The Digital Platform Emerges

The principles of the digital computer were now firmly established.  The key elements are:

Arithmetic & Logic Unit

Storage Memory Units

Control Unit

Input-Output Units

The arithmetic and logic unit is that part of the computer where data values are manipulated and calculations performed.  This section of the computer usually contains numerous registers and paths between these registers.  The arithmetic and logic unit, control unit, and primary memory constitute what is usually referred to as the Central Processing Unit (CPU).

Registers are collections of memory devices that hold particular values.  For example, when numbers are to be added, they must first be present at a physical location in the computer where the addition takes place.  The data placed in the registers is then manipulated under the direction of SOFTWARE instructions.
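As a minimal sketch (the register names and layout below are invented for illustration), registers can be pictured as named storage cells inside the CPU, with the arithmetic and logic unit combining the values placed in them:

    # Two named registers; operands must be moved into them before the ALU acts.
    registers = {"A": 0, "B": 0}

    def alu_add(regs, dest, src):
        # The "ALU": add the value in register src into register dest.
        regs[dest] = regs[dest] + regs[src]

    registers["A"] = 7
    registers["B"] = 5
    alu_add(registers, "A", "B")
    print(registers["A"])   # prints 12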

The Age of the Big Mainframe

The Mainframe Era

IBM (International Business Machines Corp.) introduced its first true computer, the 701, in 1952, six years after the ENIAC made its debut. Thomas Watson, Jr., son of IBM founder Thomas Watson, Sr., led the push into computers at IBM, arguing that it would be the beginning of the end for IBM if it did not get into the computer business and let go of obsolete tabulating machines.

The great brilliance of IBM under Thomas Watson, Sr. centered around its remarkable successes in selling and servicing equipment.  IBM's greatest asset was a sales force in business equipment that no other company could match.  The Remington-Rand UNIVAC was more technologically advanced than IBM's first computer, but that also became one of UNIVAC's key marketing problems.  The UNIVAC used magnetic tape for storage, while the IBM computers fit directly into IBM's large installed base of existing punched card storage equipment.  IBM's machines made it easy for the customer to transition to computers.

The Transistor Arrives

In 1953, IBM introduced its 650 series of computers and sold 1,000 that year through its sales force to new customers.  Thomas Watson, Sr. said "nothing happens until something is sold," and the IBM sales people went all out to break prior sales records.  The 650 was the first mass-produced computer and the first commercially successful computer.  By 1958, IBM was unquestionably the largest computer company and held a 75% market share in computing.

As early as 1947-48, AT&T's Bell Laboratories created a working model of a new invention that would revolutionize computing.  The three physicists John Bardeen, Walter Houser Brattain, and William Bradford Shockley discovered a way to perform the amplifying and switching functions of thermionic vacuum tubes on a single piece of semiconductor material.  They called it the TRANSISTOR.  In 1956, they shared the Nobel Prize in Physics for the invention, by which time transistors were already in commercial production.

Integrated Circuits Appear

One of the major innovations that made later hardware development possible was the invention of the INTEGRATED CIRCUIT by Jack Kilby at Texas Instruments in 1958 and, independently, by Robert Noyce at Fairchild Semiconductor in 1959.  Gordon E. Moore had begun working with William Shockley on transistors in 1956 and became one of the 8 founders of Fairchild Semiconductor in 1957.  The integrated circuit eliminated the complex wiring nightmare of hooking large numbers of transistors together and made even faster performance possible.  In 1965, Moore observed that the number of components that could be placed on a chip was doubling roughly every year; the prediction, later restated as a doubling about every 18 months to two years, proved remarkably durable and became known as Moore's Law.
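A rough worked example shows how quickly such doubling compounds.  The Python sketch below is illustrative only, using the popular 18-month doubling figure and a starting count of about 2,300 transistors (the 1971 Intel 4004, mentioned later in this history):

    def moores_law(start_count, years, doubling_months=18):
        # Number of devices after repeated doubling every `doubling_months`.
        return start_count * 2 ** (years * 12 / doubling_months)

    # Ten years of doubling every 18 months is roughly a 100-fold increase:
    print(round(moores_law(2300, 10)))   # roughly 230,000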

Large-scale integration (LSI) techniques enabled miniaturization of computer logic circuitry and of component manufacture.  As early as the 1950s, it was realized that scaling down the size of electronic digital computer circuits and parts would increase speed and efficiency and improve performance.

Computers Become Practical

In 1960, lithography of conductive circuit boards began to replace hand wiring. Photolithography made it possible to build resistors and capacitors into the circuitry by photographic means as printed circuits.

In 1961, Bank of America became the first major bank to use computers with the implementation of its ERMA (Electronic Recording Method of Accounting) system by General Electric.  These were prosperous times in America, and GE touted its machines on television with spokesman Ronald Reagan.

Random Access Memory (RAM) capacities increased from 64,000 to 512,000 bytes in commercially available machines by the early 1960s, with access times of a few microseconds. Such computers were typically found in large computer centers operated by industry, government, and laboratories, which had to be staffed with many programmers and support personnel.

Assembling The Binary Puzzle

Most business interest initially centered on processing numbers and information about customers, inventory, orders, and sales.  The key problem was a lack of SOFTWARE to make the new computers achieve the desired results.  Unlike hardware, software was entirely intellectual property, and specific software solutions had to be developed by people for every particular problem.  The development of software was not only extremely time consuming but also very expensive, costing 200% to 400% of the price of the computer hardware.  It was also very hard to find PROGRAMMERS who could do the job.

Computer hardware could only understand BINARY information. This meant that all program instructions and data had to be written in, or translated to, a combination of only two possible values 0 and 1.  These values could then be processed as positive or negative electrical charges by the hardware; the computer by itself was just a pile of ON/OFF switches.
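As a small illustration (a Python snippet invented here), the same quantities look like this when written out in the binary form the hardware actually stores and switches:

    # Print a few decimal values alongside their 8-bit binary representations.
    for n in (5, 13, 80):
        print(n, format(n, "08b"))
    # 5  00000101
    # 13 00001101
    # 80 01010000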

Computer Languages Emerge

Each type of computer had its own proprietary instruction set, which defined the methods and binary sequences for performing its functions.  Since strings of 0s and 1s are tedious and difficult to interpret, alphanumeric mnemonics were developed as aliases for each instruction.  These mnemonics became known as ASSEMBLY LANGUAGE, while the literal 0/1 sequences became known as MACHINE LANGUAGE.  Yet assembly language itself was still difficult to use, because it worked one machine instruction at a time and performed only LOW-LEVEL functions.
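A tiny sketch of the relationship, using three single-byte Intel 8080 instructions (a processor that appears later in this history); the Python table itself is invented purely for illustration:

    # Each mnemonic is just an alias for a fixed machine-language byte.
    OPCODES = {
        "MOV A,B": 0x78,   # copy register B into register A
        "ADD B":   0x80,   # add register B to register A
        "HLT":     0x76,   # halt the processor
    }

    assembly = ["MOV A,B", "ADD B", "HLT"]        # what the programmer writes
    machine  = [OPCODES[m] for m in assembly]     # what the hardware executes

    print([format(byte, "08b") for byte in machine])
    # ['01111000', '10000000', '01110110']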

IBM realized that a HIGH-LEVEL language needed to be created that abstracted these instructions into more useful alphanumeric commands.  The commands in the new high-level language needed to be understandable to people, so that larger and more complex programs could be written with less effort. FORTRAN was developed by IBM in the 1950s as the first HIGH-LEVEL language.  It had the ability to quickly and accurately manipulate numerical sequences.

Computer Automation

Grace Hopper of the U.S. Navy (later a Rear Admiral) set out to define a high-level language for business functions and guided the development of the COmmon Business Oriented Language (COBOL).  It was the first computer language structured like the English language and supported many common business tasks.  Like FORTRAN, COBOL relied on a COMPILER, a kind of program Hopper had pioneered, to automatically translate the high-level language into machine-usable instructions.  The concept of AUTOMATION took hold, and a primary goal became moving business and scientific tasks onto computers.

During the early 1960s, major computer manufacturers began to offer a range of computer capabilities and costs, as well as various peripheral equipment.  Consoles and card feeders were designed for input, and a variety of output devices emerged such as printers, cathode-ray-tube (CRT) displays, plotters, and storage media including magnetic tapes and hard disks.

The IBM 360 Arrives

In 1961, U.S. President John Kennedy declared the "space race" on, with the goal of being first to land a man on the moon.  NASA used the integrated circuit in the Apollo spacecraft, and the Pentagon used it extensively in rocket and missile development. The Integrated Circuit (IC), which in 1960 had cost about $1,000 for a version with 10 transistors, would very quickly come down in cost and increase rapidly in power. Robert Noyce compared the importance of the IC, and the photolithography process that made it possible, to the printing press, because the new IC chips could be mass-produced.

IBM debuted its System/360 series of mainframes, comprising six models, in 1964.  The 360 design used integrated circuit technology and a revolutionary modular platform architecture, with compatible software suitable for businesses of all sizes.  IBM System/360 mainframes in NASA's Mission Control later supported the Apollo program, including Apollo 11, which landed the first men on the moon in 1969 and let the U.S. win the space race.

The Province of the Glass Room

The mainframe computer hardware cost from $100,000 to over $3,000,000, which was very expensive in the early 1960s.  Adjusted for inflation these amounts would be from $500,000 to $20,000,000 in 1999 dollars. These machines were usually placed in large glass rooms which had to be kept at a constant temperature, and required a highly skilled staff of over 10 people to operate.

Mainframes continued to grow more powerful and became capable of concurrently serving hundreds and even thousands of users.  By the mid 1970s, punched card entry had almost entirely been replaced by direct entry with keyboards and cathode-ray display screens. IBM replaced the 360 series with the 370.  Remington-Rand became Sperry Rand.  Other firms emerged including Honeywell, Control Data, Amdahl, Cray, Hitachi, Fujitsu, and Digital Equipment.  Computers continued to use unique hardware designs and operating systems and were sold as complete units to very large and experienced customers.

The Mainframe Spawns Minis

Mainframes usually ran large databases, such as maintaining all the accounts for a bank or insurance company's customers.  Technological advances enabled the computers to become increasingly smaller while at the same time becoming more powerful.  Very Large Scale Integration (VLSI) enabled significant circuitry advances, and magnetic disks vastly increased access speed.

A new class of computer emerged, largely through the innovations of Digital Equipment Corporation (DEC), and became known as the MINICOMPUTER.  These midsize machines could perform many of the functions of a mainframe but cost considerably less, and so could be used for "less important" tasks and "departmental" computing in companies.  The DEC PDP-8 started this movement in the mid 1960s, and the later PDP-11 made minicomputers even more popular.  IBM also introduced smaller computers such as the Series/1 and the System/3, System/32, System/34, System/36, and System/38, while continuing to make its mainframes much more powerful.

The Personal Computer Revolution

Birth of the Personal Computer

In 1968, Intel Corporation was founded by Gordon E. Moore and Robert Noyce, soon joined by Andrew Grove, in Santa Clara, California to pursue very large scale transistor integration.  In 1971, Intel engineer E. Marcian "Ted" Hoff and his colleagues created the Intel 4004, a general-purpose MICROPROCESSOR that integrated about 2,300 transistors onto a single "intelligent" chip of silicon.

The 4-bit 4004 was designed for the Busicom 141-PF calculator, but it could do much more. It made possible the MICROCOMPUTER, which is simply a computer built from VLSI (Very Large Scale Integration) chips.  "Micro" comes from the Greek word for small.

The very first personal computer was designed in 1974 by Jonathan Titus of Blacksburg, Virginia.  It was called the Mark-8 and was based on the 8-bit Intel 8008.  Because Titus merely offered instructions on how to build what he referred to as a "minicomputer," the Mark-8 never made much of a mark.

The First PC

The first assembled personal computer was the Altair 8800 from Micro Instrumentation and Telemetry Systems (MITS) in Albuquerque, New Mexico, in 1975.  It was based on Intel's 8080 microprocessor but had no keyboard or display, just front-panel switches and a group of lights that flashed on and off in patterns.

Years earlier, William H. Gates III and Paul Allen had formed a company called Traf-O-Data, which used an Intel microprocessor to count how many cars rolled over a rubber hose stretched across a road.  In 1975, they realized that the Altair needed SOFTWARE to make it useful and developed a version of the BASIC programming language for it; later versions added disk and file support.

BASIC (Beginner's All-purpose Symbolic Instruction Code) had been developed by two professors, John Kemeny and Thomas Kurtz, at Dartmouth in the mid 1960s as a computer LANGUAGE teaching the basics of computer programming.

The First PC Operating System

The first truly useful Intel 8080-based microcomputer for business was the IMSAI 8080.  Aimed at small business users, it had a floppy disk drive and used the CP/M (Control Program for Microcomputers) Operating System (OS). CP/M had been developed for the Intel 8080 by Dr. Gary Kildall while he was consulting for Intel.

CP/M was written in PL/M, a systems programming language Kildall had derived from IBM's PL/I, which ran on IBM's S/360 mainframes.  In 1975, Gary Kildall and his wife Dorothy founded Intergalactic Digital Research Inc. in Monterey, California to develop and market CP/M as the first microprocessor Operating System. CP/M required only about 4 KB of RAM and established the .COM and .CMD executable formats and an API (Application Program Interface) that created a virtual program "socket."

Tandy (Radio Shack) soon introduced a microcomputer named the TRS-80, built around the Zilog Z-80 microprocessor, which was compatible with Intel's 8080 instruction set; later TRS-80 models could also run CP/M.  The TRS-80 became one of the first commercially popular microcomputers.

Digital Research and Microsoft

By 1976, Gary Kildall had changed the name of his company to Digital Research, and Bill Gates and Paul Allen had named their new company Microsoft.  As Digital Research worked on developing its CP/M operating system, Microsoft focused on programming languages for computers built around Intel and Zilog processor chips running Digital Research's CP/M Operating System.

MOS Technology introduced its 6501 and 6502 microprocessors in 1975.  Steve Jobs and Steve Wozniak, in California's Santa Clara Valley, created the Apple I computer using the 6502 CPU and demonstrated it at the Homebrew Computer Club.  In 1977, the Apple II debuted at the West Coast Computer Faire.  The MOS Technology 6502 made it into many computers, including the Apple, Commodore PET, and Atari machines.  The Motorola 6800, which is often confused with the MOS 6502, spawned the 6809 used in the Tandy Color Computer (CoCo), and it was not until the Apple Lisa of 1983 that Apple started using Motorola 68000 CPUs.

Most application programs were being written for CP/M, which ran only on the Intel and Zilog microprocessors, so Microsoft licensed CP/M from Digital Research and sold it along with its SoftCard, an add-in board with a Zilog Z-80 CPU that let the Apple II run CP/M programs alongside Apple software.

The IBM Personal Computer

Intel introduced the 16-bit 8086 microprocessor in 1978 and its 8088 variant in 1979.  Tim Paterson at Seattle Computer Products created one of the first CPU boards built around the 8086.  Instead of waiting for Digital Research to release its 16-bit version of CP/M (CP/M-86), Paterson wrote QDOS, which shipped alongside Microsoft's stand-alone Disk BASIC-86.  It was mostly a copy of CP/M with a few new file and disk handling features.

IBM had decided to base its first personal computer on the Intel microprocessor, and in late 1980 IBM approached Microsoft with a request to supply a version of BASIC for the new IBM Personal Computer.  IBM also needed an operating system for the new PC. It had intended to use Digital Research's CP/M, but Gates and Allen bought the rights to Paterson's 86-DOS for about $50,000 and made a deal with IBM in November 1980 to provide it as the operating system along with Microsoft BASIC.  On August 12, 1981, IBM introduced its new IBM PC using Microsoft DOS 1.0, which IBM called PC-DOS.  IBM did not have an exclusive license with Microsoft, and Microsoft soon began marketing its own version as MS-DOS, which it licensed to computer OEMs (Original Equipment Manufacturers) such as Compaq and many others.

The First Applications

The most popular programs, dBASE II and WordStar, ran only on CP/M systems.  Since DOS was closely modeled on CP/M, these programs could quickly be brought to the new PC.  Application programs drive the success of a computer, and the IBM PC could soon run most of the popular "killer applications."

The standalone PC model was widely adopted by business because almost everyone in an organization could benefit from this personal productivity tool, thanks to the two "killer applications" of spreadsheets and word processing.  PCs replaced two machines, the calculator and the typewriter, with one, and set the stage for replacing the filing cabinet with databases.

PCs, however, did not replace traditional mainframe computers or the newly emerging minicomputers like the Digital Equipment VAX.  Minicomputers served company departments, while mainframes continued to run entire companies and enterprises.

Industry Standard Architecture

In 1983, IBM introduced a new version of its PC with a hard disk and named it the PC XT. Instead of using very slow floppy disks to boot (start) and run the PC, users could load DOS and programs permanently onto a hard disk.  The first XT models had 10 MB hard disks and up to 640 KB of RAM.

Intel was designing the 80286 successor to the 8086, and in 1984 IBM was the first to introduce an 80286 computer with its IBM PC AT.  This computer had the first 16-bit bus to interconnect its parts which established the system known as Industry Standard Architecture (ISA).  Its operating system was the new IBM/Microsoft DOS 3.0.  The PC AT became an instant success.

IBM, however, had not made an exclusive contract with either Intel or Microsoft for the key parts that made up the IBM PC: the x86 microprocessor and the (CP/M-derived) DOS operating system.  Compaq and others quickly offered clone PCs.

Novell Creates Networking

Novell was the first to realize that PCs could be connected with boards and a special operating system to enable one or more PCs to act as a SERVER to the connected CLIENT PCs.  In 1982, Ray Noorda established Novell in Provo, Utah to design and market PC "networking" software called Novell NetWare.

With the NetWare Network Operating System (NOS), PCs could be used to share files, printers, and other system peripherals among a group of users.  DOS, itself, provided no support for network connectivity.

IBM and Microsoft also developed network products, but Novell installations were the most popular, accounting for about 75% of all networking using PCs.  Many companies saw the possibilities of entirely replacing minicomputers and even mainframes with networks of personal computers for small business and departments in big companies.  The PC was already becoming a group computer.

The Graphical User Interface

The concept of using a visual interface originated in the mid 1970s at the Xerox Palo Alto Research Center (PARC) where a graphical interface was developed for the Xerox Star computer system introduced in April 1981.

The Xerox Star did not experience any commercial success, but its ideas were copied by Apple Computer, first in the innovative Lisa in 1983 and then in the Apple Macintosh introduced in January 1984.  The Mac used the Motorola 68000 32-bit CPU running its own proprietary operating system.  The primary new application that made the Mac popular was graphical desktop publishing.

Microsoft also wanted to offer a graphical interface for the IBM PC. It announced Microsoft Windows in November 1983 and released it two years later, in November 1985, the same year that Digital Research introduced GEM (Graphics Environment Manager).

In 1984, the X Window System also emerged as the first operating-system-independent graphical interface.  Unlike the Apple Mac GUI, DR GEM, or Microsoft Windows, X can run on virtually any computer.  It was created at MIT (Massachusetts Institute of Technology) and later stewarded by the X Consortium as an open systems environment.  It is popular on UNIX systems, such as SCO (The Santa Cruz Operation Inc.) OpenServer and UnixWare.

IBM is also working on graphical interfaces with its OS/2 Workplace Shell, which replaced the earlier Presentation Manager desktop (co-developed with Microsoft) in the IBM OS/2 operating system, and which IBM says will be the common graphical interface across all of its computer systems, from home PCs to the largest mainframes.

Microsoft Windows became very popular with the introduction of Windows 3.0 in May 1990 and Windows 3.1 in May 1992.  Windows 3.x includes full color, and is the most popular client GUI with over 100 million users by summer 1997.  The latest versions of Windows include Windows NT 3.51 Server & Workstation, and Windows NT 4.0 Server & Workstation.

The Future is Real-Time Connectivity

IMS REAL/32 natively runs MS-DOS and Windows 3.x and can easily connect with Windows NT 4.0 and the upcoming NT 5.0 Server, as well as with NT and Windows 95 clients and the upcoming Windows 98.  In 1997, IMS REAL/32 is the most advanced real-time multiuser DOS and Windows OS available.  Variants of Windows NT (now being renamed Windows 2000 for 1999-2000), such as Citrix WinFrame, are not real-time and typically run the same Windows 3.x programs that REAL/32 runs, but at much higher cost and with much less reliability and flexibility.  As of Service Pack 4 for NT 4.0, Microsoft has added Terminal Server capabilities to NT, and Citrix MetaFrame is still required to support the Citrix ICA protocol in addition to the proprietary Microsoft TSE protocol used to connect to standard PCs or thin-client devices.

UNIX is the other leading contender as an Intel PC server OS, the key companies being SCO (The Santa Cruz Operation), which has pioneered Intel UNIX since 1979, and Sun, whose SunOS and Solaris lead UNIX on the Internet.  Linux, from Red Hat and Caldera, is an increasingly popular "non-branded" UNIX-like system.  Connectivity among the MS-Windows family (3.x, 95, 98, and NT/2000), REAL/32, Linux, Novell NetWare, and UNIX (SCO UnixWare and Sun Solaris) is transparent and easily accomplished today using industry-standard communications protocols such as Novell IPX and TCP/IP, the enabling protocol for the Internet and the World Wide Web.

Chronology of Personal Computers - By Ken Polsson


Copyright © 1974-2005 DigitalResearch.biz - MaxFrame Corporation


Go there...
http://www.digitalresearch.biz/HISTORY.HTM

Don
