Computer History: Operating Systems


A computer operating system (OS) provides a set of functions needed and used by most applications, and supplies the necessary linkages to control a computer’s hardware. On the earliest computers, which had no operating system, each program had to include its own drivers for the video hardware, storage devices, and other peripherals. As applications grew in number and complexity, an operating system became a necessity.
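As a rough illustration of the kind of service an OS provides (a hypothetical sketch, not taken from the original text): rather than driving the disk hardware itself, a program simply asks the operating system to read a file, and the OS and its drivers handle the hardware details behind the call. The filename here is made up for the example.

```c
/* Hypothetical sketch: an application asking the OS for a service
 * (reading a file) instead of programming the hardware directly.
 * The OS and its drivers handle the disk controller behind these calls. */
#include <stdio.h>

int main(void) {
    /* "data.txt" is just an example filename, not from the article. */
    FILE *fp = fopen("data.txt", "r");
    if (fp == NULL) {
        perror("fopen");        /* the OS reports why the request failed */
        return 1;
    }

    char buffer[128];
    /* The program never touches the disk controller; it only asks the
     * operating system to deliver the next chunk of the file. */
    while (fgets(buffer, sizeof buffer, fp) != NULL)
        fputs(buffer, stdout);

    fclose(fp);
    return 0;
}
```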

Early computers lacked any form of operating system. The user had sole use of the machine; he would arrive armed with his program and data, often on punched paper tape. The program would be loaded into the machine, and the machine set to work until the program stopped, or, perhaps more likely, crashed. Programs could generally be debugged via a front panel using switches and lights; it is said that Alan Turing was a master of this on the early Manchester Mark I machine, and that he drew the seminal concepts of operating systems from the Universal Turing Machine.

Later, machines came with libraries of support code which were linked to the user’s program to assist in operations such as input and output. This would become the genesis of the modern-day operating system. However, machines still ran a single job at a time; at Cambridge University in England the job queue was at one time a washing line from which tapes were hung with clothes pegs. The color of the pegs indicated the priority of the job.

As machines became more powerful, the time needed to run a program diminished, and the time needed to hand the equipment over to the next user became large by comparison. Accounting for and paying for machine usage went from checking the wall clock to using the computer itself to do the timing. Run queues evolved from people waiting at the door, to stacks of media waiting on a table, to using the hardware of the machine itself, such as switching which magnetic tape drive was online or stacking punch cards on top of the previous job’s cards in the reader. Operating the computer went from a task performed by the program developer to a job for full-time, dedicated machine operators.

When commercially available computer centers found they had to deal with accidental or malicious tampering with the accounting information, equipment vendors were encouraged to enhance the runtime libraries to prevent misuse of the system’s resources. Accounting practices were also expanded beyond recording CPU usage to count pages printed, cards punched, cards read, disk storage used, and even operator actions required by jobs, such as changing magnetic tapes. Eventually, the runtime libraries became a program that was started before the first customer job, read in the customer job, controlled its execution, cleaned up after it, recorded its usage, and immediately went on to process the next job.

Jobs also evolved from binary images produced by hand encoding to symbolic programs that were translated by the computer. An operating system, or “monitor” as it was sometimes called, permitted jobs to become multistep, with the monitor running several programs in sequence to effect the translation and subsequent run of the user’s program.

The conceptual bridge between the precise definition of an operating system and the colloquial one is the tendency to bundle widely used utilities and applications (such as text editors or file managers) with the basic OS for the sake of convenience. As OSes progressed, a larger selection of this ‘second class’ OS software came to be included, to the point that an OS without a graphical user interface or various file viewers is now often considered not to be a true or complete OS. To accommodate this shift in meaning, most of what was originally the “operating system” is now called the “kernel”, and “OS” has come to mean the complete package.


The mainframe: Early operating systems were very diverse, with each vendor producing one or more operating systems specific to its particular hardware. Every operating system, even from the same vendor, could have radically different models of commands, operating procedures, and facilities such as debugging aids. Typically, each time the manufacturer brought out a new machine, there would be a new operating system. This state of affairs continued until the 1960s, when IBM developed the System/360 series of machines, which all used the same instruction architecture. Because there were enormous performance differences across the range, a single operating system could not be used, and a family of operating systems was developed. (The problems encountered in the development of OS/360 are legendary, and are described by Fred Brooks in The Mythical Man-Month, a book that has become a classic of software engineering.)

OS/360 evolved successively into MFT, MVT, SVS, MVS, MVS/XA, MVS/ESA, OS/390 and z/OS, which includes a UNIX environment as well as a huge amount of new function required by modern mission-critical applications running on the zSeries mainframes. It is worth mentioning that IBM maintained full compatibility with the past, so that programs developed in the sixties can still run under z/OS with no change. Although z/OS runs UNIX applications, it is a proprietary OS, as opposed to an open system.

Control Data Corporation developed the SCOPE operating system in the 1960s for batch processing. In cooperation with the University of Minnesota, the KRONOS and later the NOS operating systems were developed during the 1970s, supporting simultaneous batch and timesharing use. Like many commercial timesharing systems, its interface was an extension of the Dartmouth BASIC operating systems, one of the pioneering efforts in timesharing and programming languages. In the late 1970s, Control Data and the University of Illinois developed the PLATO operating system, which used plasma panel displays and long-distance timesharing networks. PLATO was remarkably innovative for its time, featuring real-time chat and multi-user graphical games.

UNIVAC, the first commercial computer manufacturer, produced a series of EXEC operating systems. Like all early mainframe systems, these were batch-oriented systems that managed magnetic drums, disks, card readers and line printers. In the 1970s, UNIVAC produced the Real-Time Basic (RTB) system to support large-scale time sharing, also patterned after the Dartmouth BASIC system.

Digital Equipment Corporation developed many operating systems for its various computer lines, including the simple RT-11 system for its 16-bit PDP-11 class machines, the VMS system for the 32-bit VAX computer, and TOPS-10 and TOPS-20 time sharing systems for the 36-bit PDP-10 class systems. Prior to the widespread use of UNIX, TOPS-10 was a particularly popular system in universities, and in the early ARPANET community.

The UNIX operating system was developed at AT&T Bell Laboratories in the 1970s. Because it was essentially free in early editions, easily obtainable, and easily modified, it achieved wide acceptance. It also became a requirement within the Bell System operating companies. Since it was written in a high-level language, UNIX could be ported to any new machine architecture to which that language had been ported, which made it the choice for a second generation of minicomputers and the first generation of workstations. By widespread use it exemplified the idea of an operating system that was conceptually the same across various hardware platforms. It was still owned by AT&T, however, and that limited its use to groups and corporations who could afford to license it.

Many early operating systems were little more than collections of utilities that allowed users to run software on their systems. Some companies, such as Digital Equipment Corporation with its early systems, were able to develop better systems, but others never supported features that were useful on other hardware types.

In the late 1960s through the late 1970s, several hardware capabilities evolved that allowed similar or ported software to run on more than one system. Early systems had used microprogramming to implement features on their machines so that different underlying architectures could appear the same as others in a series. In fact, most 360s after the 360/40 (except the 360/165 and 360/168) were microprogrammed implementations.
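As a rough illustration of the idea (a hypothetical sketch, not drawn from the original text): in a microprogrammed machine, each visible instruction is carried out by a small stored sequence of simpler micro-operations, so two machines with different internals can present the same instruction set. The names and micro-op set below are invented for the example.

```c
/* Hypothetical sketch of microprogramming: one user-visible instruction
 * ("ADD register, register") is executed as a stored sequence of
 * simpler micro-operations. Names and structure are illustrative only. */
#include <stdio.h>

enum micro_op { LOAD_A, LOAD_B, ALU_ADD, STORE_RESULT, END };

/* The "microprogram" for the ADD instruction: a fixed sequence of steps
 * held in control store rather than wired directly into the hardware. */
static const enum micro_op add_microcode[] = {
    LOAD_A, LOAD_B, ALU_ADD, STORE_RESULT, END
};

int main(void) {
    int registers[2] = { 7, 35 };   /* example register file */
    int latch_a = 0, latch_b = 0, alu_out = 0;

    /* The control unit steps through the microprogram one micro-op at a time. */
    for (const enum micro_op *mpc = add_microcode; *mpc != END; mpc++) {
        switch (*mpc) {
        case LOAD_A:       latch_a = registers[0]; break;
        case LOAD_B:       latch_b = registers[1]; break;
        case ALU_ADD:      alu_out = latch_a + latch_b; break;
        case STORE_RESULT: registers[0] = alu_out; break;
        case END:          break;
        }
    }

    printf("ADD r0, r1 -> r0 = %d\n", registers[0]);  /* prints 42 */
    return 0;
}
```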

One system which evolved in this time frame was the Pick operating system. The Pick system was developed and sold by Microdata Corporation and by Dick Pick, who had created the precursors of the system with an associate, Don Nelson. It is an example of a system which started as a database application support program, graduated to system work, and still exists across a wide variety of platforms, supported on most UNIX systems as an add-on database system.

Other packages such as Oracle are middleware and contain many of the features of operating systems, but are in fact large applications supported on many hardware platforms.

As more and more hardware was packed into ever smaller packages, first with bit-slice levels of integration and then with entire systems on a single chip, small 4- and 8-bit processors of this kind came to be known as microprocessors. Most were not microprogrammed but were completely integrated general-purpose processors.

Home computers: Although most of the small 8-bit home computers of the 1980s, such as the Commodore 64, the Amstrad CPC, the ZX Spectrum series and others, could use a “normal” disk-loading operating system such as CP/M or GEOS, they could generally work without one. In fact, most if not all of these computers shipped with a built-in BASIC interpreter in ROM, which also served as a crude operating system, allowing minimal file management operations (such as deletion and copying) and sometimes disk formatting, along of course with application loading and execution, which sometimes required a non-trivial command sequence, as on the Commodore 64.

The fact that the majority of these machines were bought for entertainment and educational purposes and were seldom used for more “serious” or business/science oriented applications, partly explains why a “true” operating system was not necessary.

Another reason is that they were usually single-task and single-user machines and shipped with minimal amounts of RAM, usually between 4 and 256 kilobytes, with 64 and 128 being common figures, and 8-bit processors, so an operating system’s overhead would likely compromise the performance of the machine without really being necessary.

Even the rare word processor and office suite applications were mostly self-contained programs which took over the machine completely, as did video games.

Finally, most of these machines didn’t even ship with a built-in floppy disk drive, which made a disk-based OS impossible to use or, at best, a luxury option.

Since virtually all video game consoles and arcade cabinets designed and built after 1980 were true digital machines (unlike the analog Pong clones and derivatives), some of them carried a minimal form of BIOS or built-in game, such as the ColecoVision, the Sega Master System and the SNK Neo Geo. There were, however, successful designs where a BIOS was not necessary, such as the Nintendo NES and its clones.

Modern game consoles, starting with the PlayStation, all have a minimal BIOS that also provides interactive utilities such as memory card management, Audio or Video CD playback and copy prevention, and some carry libraries for developers to use. Few of these cases, however, would qualify as a “true” operating system.

The most notable exceptions are probably the Dreamcast game console, which includes a minimal BIOS like the PlayStation but can load the Windows CE operating system from the game disc, allowing easy porting of games from the PC world, and the Xbox game console, which is little more than a disguised Intel-based PC running a hidden, modified version of Microsoft Windows in the background.

Furthermore, there are Linux versions that will run on a PlayStation or Xbox, and perhaps other game consoles as well, provided they have access to a large mass storage device and a reasonable amount of RAM (the bare minimum for a GUI is around 512 kilobytes, as the Commodore Amiga and early Atari ST show; GEOS, however, ran on a stock C64, which came with as little as 64 kilobytes).

Long before that, Sony had released a kind of development kit called the Net Yaroze for its first PlayStation platform, which provided a series of programming and developing tools to be used with a normal PC and a specially modified “Black PlayStation” that could be interfaced with a PC and download programs from it. These operations require in general a functional OS on both platforms involved.

In general, it can be said that video game consoles and coin-operated arcade machines used at most a built-in BIOS during the 1970s, 1980s and most of the 1990s, while from the PlayStation era onwards they became more and more sophisticated, to the point of requiring a generic or custom-built OS to aid development and expandability.

The personal computer era: Apple, DOS and beyond

The development of microprocessors made inexpensive computing available for the small business and hobbyist, which in turn led to the widespread use of interchangeable hardware components using a common interconnection (such as the S-100, SS-50, Apple II, ISA, and PCI buses), and an increasing need for ‘standard’ operating systems to control them. The most important of the early OSes on these machines was Digital Research’s CP/M-80 for the 8080 / 8085 / Z-80 CPUs. It was based on several Digital Equipment Corporation operating systems, mostly for the PDP-11 architecture. MS-DOS (or PC-DOS when supplied by IBM) was based originally on CP/M-80. Each of these machines had a small boot program in ROM which loaded the OS itself from disk. The BIOS on the IBM-PC class machines was an extension of this idea and has accreted more features and functions in the 20 years since the first IBM-PC was introduced in 1981.

The decreasing cost of display equipment and processors made it practical to provide graphical user interfaces for many operating systems, such as the generic X Window System that is provided with many UNIX systems, or other graphical systems such as Microsoft Windows, the Radio Shack Color Computer’s OS-9 Level II, Commodore’s AmigaOS, Apple’s Mac OS, or even IBM’s OS/2. The original GUI was developed at Xerox Palo Alto Research Center in the early ’70s (on the Alto computer system) and imitated by many vendors.

Computer History…

So when was the first computer invented? There is no easy answer to this question because of all the different classifications of computers. Therefore, this document lists each of the first computers, starting with the first automatic computing engines and leading up to the computers of today. Keep in mind that early inventions such as the abacus, calculators, and tablet machines are not covered here.

First mechanical computer or automatic computing engine concept

In 1822, Charles Babbage proposed and began developing the Difference Engine, considered to be the first automatic computing engine, capable of computing several sets of numbers and making hard copies of the results. Unfortunately, because of funding problems he was never able to complete a full-scale functional version of the machine. In June 1991, the London Science Museum completed Difference Engine No. 2 for the bicentennial year of Babbage’s birth, and later completed the printing mechanism in 2000.

Later, in 1837, Charles Babbage proposed the first general mechanical computer, the Analytical Engine. The Analytical Engine contained an arithmetic logic unit (ALU), basic flow control, and integrated memory, and it was the first general-purpose computer concept. Unfortunately, because of funding issues this computer was never built while Charles Babbage was alive. In 1910, Henry Babbage, Charles Babbage’s youngest son, completed a portion of the machine, which was able to perform basic calculations.

First programmable computer

The Z1, created by Germany’s Konrad Zuse in his parents’ living room between 1936 and 1938, is considered to be the first electro-mechanical binary programmable (modern) computer and really the first functional computer.

The first electric programmable computer

The Colossus was the first electric programmable computer and was developed by Tommy Flowers and first demonstrated in December 1943. The Colossus was created to help the British code breakers read encrypted German messages.

The first digital computer

Short for Atanasoff-Berry Computer, the ABC began development by Professor John Vincent Atanasoff and graduate student Cliff Berry in 1937 and continued to be developed until 1942 at Iowa State College (now Iowa State University). The ABC was an electrical computer that used vacuum tubes for digital computation, including binary math and Boolean logic, and had no CPU. On October 19, 1973, US Federal Judge Earl R. Larson ruled that the ENIAC patent by Eckert and Mauchly was invalid and named Atanasoff the inventor of the electronic digital computer.

The ENIAC was invented by J. Presper Eckert and John Mauchly at the University of Pennsylvania; construction began in 1943 and was not completed until 1946. It occupied about 1,800 square feet, used about 18,000 vacuum tubes, and weighed almost 50 tons. Although the judge ruled that the ABC was the first digital computer, many still consider the ENIAC to be the first because it was fully functional.

The first stored program computer

The early British computer known as the EDSAC is considered to be the first practical stored-program electronic computer. It performed its first calculation on May 6, 1949, and later ran one of the first graphical computer games, a noughts-and-crosses program known as OXO.

The first computer company

The first computer company was the Electronic Controls Company and was founded in 1949 by J. Presper Eckert and John Mauchly, the same individuals who helped create the ENIAC computer. The company was later renamed to EMCC or Eckert-Mauchly Computer Corporation and released a series of mainframe computers under the UNIVAC name.

The first computer to store and run a program from memory

First delivered to the United States Government in 1950, the UNIVAC 1101 or ERA 1101 is considered to be the first computer that was capable of storing and running a program from memory.

First commercial computer

In 1942, Konrad Zuse began working on the Z4, which later became the first commercial computer after it was sold to Eduard Stiefel, a mathematician at the Swiss Federal Institute of Technology Zurich, on July 12, 1950.

The first PC (IBM compatible) computer

On April 7, 1953, IBM publicly introduced the 701, its first electric computer and its first mass-produced computer. Later, in 1981, IBM introduced its first personal computer, the IBM PC. The computer was code-named (and is still sometimes referred to as) the Acorn; it had an 8088 processor and 16 KB of memory, expandable to 256 KB, and ran MS-DOS.

The first computer with RAM

MIT introduced the Whirlwind machine on March 8, 1955, a revolutionary computer that was the first digital computer with magnetic core RAM and real-time graphics.

The first transistor computer

The TX-0 (Transistorized Experimental computer) was the first transistorized computer; it was demonstrated at the Massachusetts Institute of Technology in 1956.

The first minicomputer

In 1960, Digital Equipment Corporation released the first of its many PDP computers, the PDP-1.

The first mass-market PC

In 1968, Hewlett Packard began marketing the first mass-marketed PC, the HP 9100A.

The first workstation

Although it was never sold, the first workstation is considered to be the Xerox Alto, introduced in 1974. The computer was revolutionary for its time and included a fully functional computer, display, and mouse. The computer operated like many computers today utilizing windows, menus and icons as an interface to its operating system.

The first microprocessor

Intel introduced the first microprocessor, the Intel 4004, on November 15, 1971.

The first personal computer

In 1975, Ed Roberts coined the term “personal computer” when he introduced the Altair 8800, although the first personal computer is considered by many to be the Kenbak-1, which was first introduced for $750 in 1971. That computer relied on a series of switches for inputting data and output data by turning a series of lights on and off.

The Micral is considered to be the first commercial computer sold fully assembled rather than as a kit. It used the Intel 8008 processor and sold for $1,750 in 1973.

The first laptop or portable computer

The IBM 5100 was the first portable computer, released in September 1975. The computer weighed 55 pounds and had a five-inch CRT display, a tape drive, a 1.9 MHz PALM processor, and 64 KB of RAM.

The first truly portable computer, or laptop, is considered to be the Osborne I, which was released in April 1981 and developed by Adam Osborne. The Osborne I weighed 24.5 pounds, had a 5-inch display, 64 KB of memory and two 5 1/4-inch floppy drives, ran the CP/M 2.2 operating system, included a modem, and cost US$1,795.

The IBM PC Division (PCD) later released the IBM Portable in 1984, its first portable computer, which weighed in at 30 pounds. Later, in 1986, IBM PCD announced its first laptop computer, the PC Convertible, weighing 12 pounds. Finally, in 1994, IBM introduced the IBM ThinkPad 755CD, the first notebook with an integrated CD-ROM.

The first Apple computer

Steve Wozniak designed the first Apple computer, known as the Apple I, in 1976.

The first PC clone

The Compaq Portable is considered to be the first PC clone and was released in March 1983 by Compaq. The Compaq Portable was 100% compatible with IBM computers and was capable of running any software developed for them.

See the list of other major computer companies’ firsts below for other IBM-compatible computers.

The first multimedia computer

In 1992, Tandy Radio Shack became one of the first companies to release a computer based on the MPC standard with its introduction of the M2500 XL/2 and M4020 SX computers.

Other major computer company firsts

Below is a listing of some of the major computer companies’ first computers.

Compaq – In March 1983, Compaq released its first computer, and the first 100% IBM-compatible computer, the “Compaq Portable.”
Dell – In 1985, Dell introduced its first computer, the “Turbo PC.”
Hewlett Packard – In 1966, Hewlett Packard released its first general computer, the “HP-2115.”
NEC – In 1958, NEC built its first computer, the “NEAC 1101.”
Toshiba – In 1954, Toshiba introduced its first computer, the “TAC” digital computer.

The Acorn BBC Micro Computer

The BBC Micro computer was launched in December of 1981 as part of the BBC’s Computer Literacy Project. The Computer Literacy Project was created by the BBC (British Broadcasting Corporation) to increase computer literacy and to encourage as wide a range of people as possible to gain hands-on experience with a microcomputer. The BBC Micro was very successful in the UK, selling over 1.5 million units, and was widely used in schools, with the large majority of schools having one.

As part of the project the BBC wanted to commission the development of a microcomputer that could accompany the TV programmes and literature. The BBC created a list of required specifications in early 1981, with companies including Acorn Computers, Sinclair Research, Tangerine Computer Systems, Newbury Laboratories and Dragon Data showing interest in the project. Acorn’s bid won in March 1981 with their Proton prototype, which was being developed as the successor to the Acorn Atom. While the BBC Micro was launched in December 1981, production problems meant that deliveries of the computer were delayed.

The BBC Microcomputer, or the ‘Beeb’, was based on the 6502A microprocessor running at 2 MHz and had 32K of ROM. The Model A shipped with 16K of RAM and cost £299; the Model B shipped with 32K of RAM and cost £399, and its larger RAM allowed higher-resolution graphics. Both models used the same circuit board, making it possible to upgrade a Model A to a Model B. The machine’s high cost was compensated for by its impressive expansion possibilities, including disc drives, a second processor and network capabilities (Econet).

The BBC Micro used BBC BASIC, a version of the BASIC programming language created mainly by Sophie Wilson.

The BBC Micro is housed in a case which includes an internal power supply and a 64-key keyboard with 10 additional user-definable keys. On the back of the case there are ports for UHF out, video out, RGB, RS-423, cassette, analogue in and Econet.

“For me this was the machine that really got me into programming and microelectronics. The BBC Micro was developed by Acorn computers for the BBC who were embarking on an education programme for the UK called the “BBC Computer Literacy Project”. The BBC made it their mission to have at least one of these machines available in every school in the UK.

The ‘beeb’, as it quickly became known, was fantastic for connecting to external equipment. It featured an analogue ‘joystick’ port, a digital ‘user’ port, a 1MHz bus connection, a ‘tube’ connection and a plethora of other connections. So many, in fact, that the back of the machine ran out of space and they had to create a cut-away bay underneath the machine to accommodate them. But it was due to its connectivity and expandability that I really took to the beeb and started designing peripherals and software.

It was not a cheap machine. The BBC Model B sold for £399 on the high street in 1983 which was relatively expensive compared with other available machines like the Commodore 64 which sold for around £229. Regardless of the difference in price, because it was backed by the BBC, the beeb sold very well with over 1 million units sold.”

Manufacturer: Acorn
Date: 1981

Birth of the Computer

Who invented the computer that is now part of everyday use? To answer that, we need to go back to where it all began!

The earliest form of counting device would be the tally stick, but the one we all remember from our childhood days would be the abacus, first used in Babylonia around 2400 BC.

The astrolabe and the Antikythera mechanism, analog mechanical computers, were used in Ancient Greece around 150-100 BC to perform astronomical calculations. They were followed by the planisphere of Abu Rayhan al-Biruni around AD 1000, the equatorium of Abu Ishaq Ibrahim al-Zarqali around AD 1015, and the astronomical clock tower of Su Song during the Song Dynasty, around AD 1090.

In 1617, John Napier, the Scottish mathematician and physicist, invented Napier’s bones, a device similar in appearance to an abacus that could perform multiplication and division. In the 1620s the slide rule was invented, a device that allows multiplication and division by representing numbers as distances along logarithmic scales and adding or subtracting those distances. The use of the slide rule faded out with the invention of the pocket calculator.
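A quick sketch of the slide rule principle (an illustrative example, not from the original text): the rule multiplies by adding lengths proportional to logarithms, since log(a) + log(b) = log(a × b). The numbers below are arbitrary.

```c
/* Illustrative sketch of the slide-rule principle: multiplying two numbers
 * by adding lengths proportional to their logarithms.
 * (Example values chosen for illustration only.) */
#include <stdio.h>
#include <math.h>

int main(void) {
    double a = 3.0, b = 4.0;

    /* On a slide rule, each number is marked at a distance proportional
     * to its base-10 logarithm along the scale. */
    double dist_a = log10(a);
    double dist_b = log10(b);

    /* Sliding one scale against the other adds the two distances;
     * reading the result back off the scale undoes the logarithm. */
    double product = pow(10.0, dist_a + dist_b);

    printf("%.1f x %.1f = %.1f (via log distances)\n", a, b, product);
    return 0;
}
```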

Wilhelm Schickard, a German, designed the calculating clock in 1623, but it was destroyed by fire during construction in 1624 and never rebuilt.

In 1642 Blaise Pascal invented a mechanical calculator, duly named the Pascaline.

The Stepped Reckoner was invented by Gottfried Wilhelm von Leibniz in 1672 and was inspired by his experience with Pascal’s Pascaline machine.

The Frenchman Joseph Marie Jacquard invented the powered loom in 1801. It used punched wooden cards, which defined the weave’s pattern. In today’s world of computers, these cards would be the equivalent of a software program.

The Arithmometer, invented by Charles Xavier Thomas de Colmar, became the first mass-produced mechanical calculator in the 1820s, for it could add, subtract, multiply and divide.

In 1837 Charles Babbage designed the first mechanical computer, the Analytical Engine. The device was never finished during his lifetime, and it was left to his son Henry to complete a simplified form of the work, the Mill, in 1888.

Ada Lovelace, daughter of the poet Lord Byron, was an analyst of Babbage’s Analytical Engine and went on to create the first computer program between 1842 and 1843. It was her vision that computers would be capable of more than basic arithmetic calculations.

Towards the end of the 1880s Herman Hollerith invented machines to read and record punched cards: the tabulator and the keypunch, used in equipment such as the Hollerith desk with which the U.S. carried out the 1890 census. On the back of this success he went on to found the Tabulating Machine Company, which eventually became International Business Machines (IBM).

Alan Turing is considered by many to be the father of computer science, for it was in 1936 that he formalised the concepts of algorithm and computation with the Turing machine, providing a blueprint for the electronic digital computer.

Just think: when you turn on your computer, you’re actually using a design based on the brainchild of Alan Turing.

In 1947, Howard Aiken was commissioned by IBM to determine how many computers it would take to run the United States… His answer was six. How wrong he was; who would have believed that, some sixty-six years later, most homes would have at least one computer?

In 1936 Konrad Zuse built the first computer, the Z1, believed to be the first electro-mechanical binary programmable computer.

In November 1937, whilst working at Bell Labs, George Stibitz invented the Model K relay-based calculator, which used binary circuits to perform calculations.

John Atanasoff, a physics professor from Iowa, built the first electronic digital computer in 1937, assisted by graduate student Clifford Berry. It had not been designed as a programmable machine; its main purpose was solving linear equations.

Konrad Zuse, who built the Z1 back in 1936, took his invention to the next stage in 1941 by building the first program-controlled electromechanical computing machine, the Z3.

Thomas Flowers joined the Post Office Research Branch in 1930, where he became Head of Switching Research. During the 1930s Flowers pioneered large-scale digital electronics, and in 1943 he designed and constructed the British computer Colossus.

Harry Fensom joined Flowers’ inner circle of engineers at the Research Branch of the British Post Office in 1942. He participated in the construction of the code breaking machine, Colossus, and was responsible for keeping it in continuous operation at Bletchley Park.

It was the world’s first electronic programmable computer, built from a large number of vacuum tubes. Even though it had its limits when it came to programming, its main use was in breaking German wartime codes.

In 1939 development started on the Harvard Mark I, the Automatic Sequence Controlled Calculator, a general-purpose electro-mechanical computer designed by Howard Aiken, programmed by pioneers such as Grace Hopper, and financed by IBM. It came into use in May 1944.

The ENIAC was the brainchild of John Presper Eckert and John W Mauchly and was completed in 1946. Its architectural design required the rewiring of a plug board to change its programming. It was capable of adding and subtracting five thousand times a second and could also perform multiplication, division and square-root calculations. It weighed in at thirty tons, used two hundred kilowatts of power, and contained eighteen thousand vacuum tubes, fifteen hundred relays, and hundreds of thousands of resistors, capacitors and inductors.

The Small-Scale Experimental Machine, also known as the Baby, was completed in 1948 at the University of Manchester in England, based upon the stored-program architecture. On 21 June 1948 it made its first successful run of a program, using a 32-bit word length and a memory of 32 words.

The Manchester Mark I, a more powerful machine, was built to supersede the Baby, with expanded size and power and a magnetic drum for auxiliary storage.

The following year, Cambridge University built the Electronic Delay Storage Automatic Calculator (EDSAC), which was also a stored-program machine.

In October 1948 the Government asked Ferranti to build a commercial computer based on the design of the Manchester Mark I. The Ferranti Mark I included enhancements making it more powerful and faster; the first machine was delivered in February 1951.

John Presper Eckert and John W Mauchly, who designed and built the ENIAC, updated their design in 1951 with the release of the UNIVAC, for use by the U.S. Census Bureau. It used 5,200 vacuum tubes and consumed some 125 kW of power. Storage was by way of serial-access mercury delay lines.

In the early 1950s, two Soviet scientists, Sergei Sobolev and Nikolay Brusentsov, designed the Setun, a ternary computer that operated on a balanced base-three numbering system (-1, 0, 1) rather than the conventional binary system. The computer was used within the Soviet Union, but its life was short-lived and the architecture was eventually replaced with binary systems.
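To illustrate the balanced base-three idea the Setun used, here is a small sketch (an illustrative example, not from the original text) that converts an ordinary integer into balanced ternary digits, where every digit is -1, 0 or 1:

```c
/* Illustrative sketch: converting a positive integer to balanced ternary,
 * the (-1, 0, 1) digit system used by the Setun.
 * (Example for illustration only; not part of the original article.) */
#include <stdio.h>

int main(void) {
    int n = 47;          /* number to convert (arbitrary example) */
    int digits[32];      /* least-significant digit first */
    int count = 0;

    int value = n;
    while (value != 0) {
        int r = value % 3;          /* remainder 0, 1 or 2 for positive value */
        value /= 3;
        if (r == 2) {               /* a remainder of 2 becomes digit -1, carry 1 */
            r = -1;
            value += 1;
        }
        digits[count++] = r;
    }

    printf("%d in balanced ternary (most significant first): ", n);
    for (int i = count - 1; i >= 0; i--)
        printf("%s%d", (i == count - 1) ? "" : " ", digits[i]);   /* 1 -1 -1 1 -1 */
    printf("\n");
    return 0;
}
```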

In 1952, IBM released its first electronic data processing machine and its first mainframe computer, the IBM 701. Then in 1954 the IBM 704 came onto the market, using magnetic core memory. During 1955-1956, IBM developed the Fortran programming language for the IBM 704; it was released in 1957.

In 1954, IBM produced a smaller computer, the IBM 650, weighing in at 900 kg with a power unit of 1,350 kg. As built, it had a drum memory unit which could hold 2,000 words, later increased to 4,000 words, with a maximum of ten characters per word. The IBM 650 used SOAP (the Symbolic Optimal Assembly Program).

Microprogramming was invented by Maurice Wilkes in 1955.

Then in 1956 IBM created the first disk storage unit, the IBM 350 RAMAC (Random Access Method of Accounting and Control). It used fifty 24-inch metal disks with one hundred tracks per side, capable of storing five megabytes.

John Presper Eckert and John W Mauchly recognized the limitations of the ENIAC even before its construction was completed. They started researching the possibility of storing programs and working data in the same memory, an idea that at the time would have been considered rather radical.

Equipment of the mid-1950s stored data in acoustic delay lines using liquid mercury or a wire. It worked by sending acoustic pulses representing a 1 or 0 down the line and re-circulating each pulse as it emerged. Other systems on the market at the time used cathode-ray tubes, storing and retrieving data on a phosphor screen.

Magnetic core memory, where each core stores one bit, was created in 1954, replacing many forms of temporary storage, and would go on to dominate the market for many years to come.

The bipolar transistor, invented in 1947, went on to replace vacuum tubes from 1955 onwards. The early versions were germanium point-contact transistors, which consumed less power but had reliability issues.

The University of Manchester built the first transistorized computer in 1953, and an updated version was running by 1955. It used two hundred transistors and thirteen hundred solid-state diodes, with a power consumption of 150 watts. The Harwell CADET, which had no tubes at all, had a tendency to crash every ninety minutes, but by changing to more reliable bipolar junction transistors the crashes became less frequent.

Compared with vacuum tubes, transistors had many advantages: they were smaller, required less power and gave off less heat. Transistorized computers could contain tens of thousands of binary logic circuits in a compact space.

With the creation of transistorized electronics came the central processing unit, within which sits the ALU (arithmetic logic unit) that performs arithmetic and logic operations; it was the first of many components to show steady improvement.

In a sense, the new technology had opened the floodgates to improved computer components: where once they would have taken up a large room, they were now reduced in size and capable of sitting on a table. One such invention was the disk storage unit, capable of storing tens of millions of letters and digits, alongside removable disk storage units. Input/output devices provided the means by which a computer exchanges information with the outside world.

Telephone connections went on to provide sufficient speeds for early remote terminals like the Teletype or Telex machine.

Who would have believed that these stand-alone computers would one day be the basis for the Internet?

Jack Kilby and Robert Noyce designed the integrated circuit (microchip) in 1958, which led to the invention of the microprocessor. Then in 1964 IBM released its Solid Logic Technology modules, hybrid circuits.

Intel opened its doors in 1968 and in its early days produced semiconductor memory, going on to create DRAM and EPROM. Intel developed its microprocessor in 1971, cramming the heart of an entire computer onto a single chip.

They produced the Intel 4004, the first microprocessor, consisting of 2,300 transistors and clocked at 108 kHz. They followed up with the 8008 and 8080 models.

The 8080 was used in the MITS Altair computer kit. This machine prompted Bill Gates, then a Harvard student, to drop out of college and write programs for it.

Alan Shugart and IBM invented the floppy disk in 1971; it earned the nickname “floppy” from its flexible construction.

The idea of computers exchanging information with one another over telecommunication links had been around for years. Then in 1973 Robert Metcalfe and Xerox created Ethernet computer networking.

Olivetti, a company more associated with typewriters, presented its first personal computer, the P6060, to the world in 1975. It had a 32-character display, an 80-column thermal printer and 48 KB of RAM, used the BASIC language, and weighed in at 40 kg.

In 1981 Bill Gates and Microsoft supplied the world with MS-DOS, an operating system to run the computer. That same year IBM released its personal computer, and so the home computer revolution had started.

In 1983 Apple released its first home computer with a graphical user interface, the Lisa. In 1984 Apple released the more affordable Macintosh, also with a graphical user interface.

In 1985 Bill Gates and Microsoft released the new operating system that would revolutionise the computer for decades to come: Microsoft Windows, which has been upgraded repeatedly over the years. We have now reached Windows 10.

With the 1990s came e-mail and the World Wide Web… and computers and the Internet would change our world forever.