Computer History: Operating Systems

Computer operating systems (OS) provide a set of functions needed and used by most applications, and provide the necessary linkages to control a computer’s hardware. On the first computers, which had no operating system, each program had to include its own drivers for the display, storage, and other peripherals. The growing number and complexity of applications led to the need for an operating system.

Early computers lacked any form of operating system. The user had sole use of the machine: he would arrive armed with his program and data, often on punched paper tape, load the program, and set the machine to work until the program stopped, or, more likely, crashed. Programs could generally be debugged via a front panel using switches and lights; it is said that Alan Turing was a master of this on the early Manchester Mark I machine, and that he drew the seminal concepts of operating systems from his Universal Turing Machine.

Later, machines came with libraries of support code which were linked to the user’s program to assist in operations such as input and output. This would become the genesis of the modern-day operating system. However, machines still ran a single job at a time; at Cambridge University in England the job queue was at one time a washing line from which tapes were hung with clothes pegs. The color of the pegs indicated the priority of the job.

As machines became more powerful, the time needed to run a program diminished, and the time to hand the equipment over to the next user became large by comparison. Accounting for and paying for machine usage went from checking the wall clock to using the computer itself to do the timing. Run queues evolved from people waiting at the door, to stacks of media waiting on a table, to using the hardware of the machine itself, such as switching which magnetic tape drive was online or stacking punch cards on top of the previous job’s cards in the reader. Operating the computer went from a task performed by the program developer to a job for full-time dedicated machine operators.

When commercially available computer centers found they had to deal with accidental or malicious tampering with the accounting information, equipment vendors were encouraged to enhance the runtime libraries to prevent misuse of the system’s resources. Accounting practices were also expanded beyond recording CPU usage to count pages printed, cards punched, cards read, disk storage used, and even operator actions required by jobs, such as changing magnetic tapes. Eventually, the runtime libraries became a program that was started before the first customer job, read in the customer job, controlled its execution, cleaned up after it, recorded its usage, and immediately went on to process the next job.

Jobs also evolved from binary images produced by hand encoding to symbolic programs that were translated by the computer. An operating system, or “monitor” as it was sometimes called, permitted jobs to become multistep, with the monitor running several programs in sequence to effect the translation and subsequent run of the user’s program.

The conceptual bridge between the precise definition of an operating system and the colloquial one is the tendency to bundle widely used utilities and applications (such as text editors or file managers) with the basic OS for the sake of convenience. As OSes progressed, a larger selection of such ‘second class’ OS software came to be included, to the point that an OS without a graphical user interface or various file viewers is now often considered not to be a true or complete OS. To accommodate this evolution of the meaning, most of what was the original “operating system” is now called the “kernel”, and OS has come to mean the complete package.

The mainframe: Early operating systems were very diverse, with each vendor producing one or more operating systems specific to their particular hardware. Every operating system, even from the same vendor, could have radically different models of commands, operating procedures, and such facilities as debugging aids. Typically, each time the manufacturer brought out a new machine, there would be a new operating system. This state of affairs continued until the 1960s, when IBM developed the System/360 series of machines, which all used the same instruction architecture. Because there were enormous performance differences across the range, a single operating system could not be used, and a family of operating systems was developed. (The problems encountered in the development of OS/360 are legendary, and are described by Fred Brooks in The Mythical Man-Month, a book that has become a classic of software engineering.)

OS/360 evolved to become successively MFT, MVT, SVS, MVS, MVS/XA, MVS/ESA, OS/390 and z/OS, which includes a UNIX environment as well as a huge amount of new functionality required by modern mission-critical applications running on zSeries mainframes. It is worth mentioning that IBM maintained full compatibility with the past, so that programs developed in the sixties can still run under z/OS with no change. Although z/OS runs UNIX applications, it is a proprietary OS, as opposed to an open system.

Control Data Corporation developed the Scope operating system in the 1960s for batch processing. In cooperation with the University of Minnesota, the KRONOS and later the NOS operating systems were developed during the 1970s, supporting simultaneous batch and timesharing use. Like many commercial timesharing systems, its interface was an extension of the Dartmouth BASIC operating systems, one of the pioneering efforts in timesharing and programming languages. In the late 1970s, Control Data and the University of Illinois developed the PLATO operating system, which used plasma panel displays and long-distance timesharing networks. PLATO was remarkably innovative for its time, featuring real-time chat and multi-user graphical games.

UNIVAC, the first commercial computer manufacturer, produced a series of EXEC operating systems. Like all early mainframe systems, EXEC was a batch-oriented system that managed magnetic drums, disks, card readers and line printers. In the 1970s, UNIVAC produced the Real-Time Basic (RTB) system to support large-scale time sharing, also patterned after the Dartmouth BASIC system.

Digital Equipment Corporation developed many operating systems for its various computer lines, including the simple RT-11 system for its 16-bit PDP-11 class machines, the VMS system for the 32-bit VAX computer, and TOPS-10 and TOPS-20 time sharing systems for the 36-bit PDP-10 class systems. Prior to the widespread use of UNIX, TOPS-10 was a particularly popular system in universities, and in the early ARPANET community.

The UNIX operating system was developed at AT&T Bell Laboratories in the 1970s. Because it was essentially free in early editions, easily obtainable, and easily modified, it achieved wide acceptance. It also became a requirement within the Bell System operating companies. Since it was written in a high-level language, whenever that language was ported to a new machine architecture, UNIX could be ported along with it. This portability permitted it to become the choice for a second generation of minicomputers and the first generation of workstations, and its widespread use exemplified the idea of an operating system that was conceptually the same across various hardware platforms. It was still owned by AT&T, however, and that limited its use to those groups or corporations who could afford to license it.

Many early operating systems were little more than collections of utilities that allowed users to run software on their systems. Some companies, such as Digital Equipment Corporation with its early systems, were able to develop better systems, but others never supported features that were useful on other hardware types.

In the late 1960s through the late 1970s, several hardware capabilities evolved that allowed similar or ported software to run on more than one system. Early systems had utilized microprogramming to implement features on their machines, permitting different underlying architectures to appear the same as others in a series. In fact, most 360s after the 360/40 (except the 360/165 and 360/168) were microprogrammed implementations.
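To make the idea concrete, here is a minimal sketch in Python of how microprogramming works; the toy machine, its opcodes and its micro-operations are all hypothetical, invented purely for illustration:

```python
# Microprogramming sketch: each user-visible instruction is implemented
# as a sequence of micro-operations held in a "control store". Two
# machines with different hardware can expose the same instruction set
# simply by shipping different control stores.

CONTROL_STORE = {
    # opcode -> sequence of micro-operations (all names hypothetical)
    "LOAD": ["fetch_operand", "write_acc"],
    "ADD":  ["fetch_operand", "alu_add", "write_acc"],
}

def run(program, memory):
    acc = 0        # accumulator register
    latch = 0      # internal data latch
    for opcode, addr in program:
        for micro_op in CONTROL_STORE[opcode]:  # interpret each instruction
            if micro_op == "fetch_operand":
                latch = memory[addr]
            elif micro_op == "alu_add":
                latch = acc + latch
            elif micro_op == "write_acc":
                acc = latch
    return acc

print(run([("LOAD", 0), ("ADD", 1)], [2, 3]))  # prints 5
```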

One system that evolved in this time frame was the Pick operating system. The Pick system was developed and sold by Microdata Corporation and by Dick Pick, who created the precursors of the system with an associate, Don Nelson. It is an example of a system that started as a database application support program, graduated to system work, and still exists across a wide variety of platforms, supported on most UNIX systems as an add-on database system.

Other packages such as Oracle are middleware and contain many of the features of operating systems, but are in fact large applications supported on many hardware platforms.

As more and more hardware was packed into ever smaller packages, first at the bit-slice level of integration, entire processors eventually came to be present on a single chip. Such single-chip systems, in small 4-bit and 8-bit processors, came to be known as microprocessors. Most were not microprogrammed, but were completely integrated general-purpose processors.

Home computers: Although most of the small 8-bit home computers of the 1980s, such as the Commodore 64, the Amstrad CPC, the ZX Spectrum series and others, could use a “normal” disk-loading operating system such as CP/M or GEOS, they could generally work without one. In fact, most if not all of these computers shipped with a built-in BASIC interpreter in ROM, which also served as a crude operating system, allowing minimal file management operations (such as deletion, copying, etc.) and sometimes disk formatting, along of course with application loading and execution, which sometimes required a non-trivial command sequence, as on the Commodore 64 (typically something like LOAD "*",8,1 followed by RUN to load and start a program from disk).

The fact that the majority of these machines were bought for entertainment and educational purposes, and were seldom used for more “serious” or business/science-oriented applications, partly explains why a “true” operating system was not necessary.

Another reason is that they were usually single-task, single-user machines that shipped with minimal amounts of RAM (usually between 4 and 256 kilobytes, with 64 and 128 being common figures) and 8-bit processors, so an operating system’s overhead would likely have compromised the performance of the machine without really being necessary.

Even the rare word processor and office suite applications were mostly self-contained programs that took over the machine completely, as did videogames.

Finally, most of these machines didn’t even ship with a built-in floppy disk drive, which made a disk-based OS either impossible or an expensive luxury.

Since virtually all video game consoles and arcade cabinets designed and built after 1980 were true digital machines (unlike the analog PONG clones and derivatives), some of them carried a minimal form of BIOS or built-in game, such as the ColecoVision, the Sega Master System and the SNK Neo Geo. There were, however, successful designs where a BIOS was not necessary, such as the Nintendo NES and its clones.

Modern game consoles, starting from the PlayStation, all have a minimal BIOS that also provides interactive utilities such as memory card management, audio or video CD playback and copy prevention, and sometimes carries libraries for developers to use. Few of these cases, however, would qualify as a “true” operating system.

The most notable exceptions are probably the Dreamcast game console, which includes a minimal BIOS like the PlayStation but can load the Windows CE operating system from the game disk, allowing easy porting of games from the PC world, and the Xbox game console, which is little more than a disguised Intel-based PC running a secret, modified version of Microsoft Windows in the background.

Furthermore, there are Linux versions that will run on a PlayStation or Xbox, and perhaps other game consoles as well, provided they have access to a large mass storage device and a reasonable amount of RAM. (The bare minimum for a GUI is around 512 kilobytes, as the Commodore Amiga and early Atari ST show; GEOS, however, ran on a stock C64, which came with as little as 64 kilobytes.)

Long before that, Sony had released a kind of development kit called the Net Yaroze for its first PlayStation platform, which provided a series of programming and development tools to be used with a normal PC and a specially modified “black PlayStation” that could be interfaced with a PC and download programs from it. In general, these operations require a functional OS on both platforms involved.

In general, it can be said that videogame consoles and coin-operated arcade machines used at most a built-in BIOS during the 1970s, 1980s and most of the 1990s, while from the PlayStation era onwards they became more and more sophisticated, to the point of requiring a generic or custom-built OS to aid development and expandability.

The personal computer era: Apple, DOS and beyond

The development of microprocessors made inexpensive computing available for the small business and hobbyist, which in turn led to the widespread use of interchangeable hardware components using a common interconnection (such as the S-100, SS-50, Apple II, ISA, and PCI buses), and an increasing need for ‘standard’ operating systems to control them. The most important of the early OSes on these machines was Digital Research’s CP/M-80 for the 8080 / 8085 / Z-80 CPUs. It was based on several Digital Equipment Corporation operating systems, mostly for the PDP-11 architecture. MS-DOS (or PC-DOS when supplied by IBM) was based originally on CP/M-80. Each of these machines had a small boot program in ROM which loaded the OS itself from disk. The BIOS on the IBM-PC class machines was an extension of this idea and has accreted more features and functions in the 20 years since the first IBM-PC was introduced in 1981.

The decreasing cost of display equipment and processors made it practical to provide graphical user interfaces for many operating systems, such as the generic X Window System that is provided with many UNIX systems, or other graphical systems such as Microsoft Windows, the Radio Shack Color Computer’s OS-9 Level II, Commodore’s AmigaOS, Apple’s Mac OS, or even IBM’s OS/2. The original GUI was developed at Xerox’s Palo Alto Research Center in the early ’70s (on the Alto computer system) and imitated by many vendors.

Computer History…

So when was the first computer invented? There is no easy answer to this question, because of all the different classifications of computers. This document therefore lists each of the first computers, starting with the first automatic computing engines and leading up to the computers of today. Keep in mind that early inventions such as the abacus, calculators, and tablet machines are not covered here.

First mechanical computer or automatic computing engine concept

In 1822, Charles Babbage proposed and began developing the Difference Engine, considered to be the first automatic computing engine; it was capable of computing several sets of numbers and making hard copies of the results. Unfortunately, because of funding problems he was never able to complete a full-scale functional version of the machine. In June 1991, the London Science Museum completed Difference Engine No. 2 for the bicentennial of Babbage’s birth, and later completed its printing mechanism in 2000.

Later, in 1837, Charles Babbage proposed the first general mechanical computer, the Analytical Engine. The Analytical Engine contained an arithmetic logic unit (ALU), basic flow control, and integrated memory, and was the first general-purpose computer concept. Unfortunately, because of funding issues, this computer was never built while Charles Babbage was alive. In 1910, Henry Babbage, Charles Babbage’s youngest son, was able to complete a portion of the machine, which could perform basic calculations.

First programmable computer

The Z1, created by Germany’s Konrad Zuse in his parents’ living room between 1936 and 1938, is considered to be the first electro-mechanical binary programmable (modern) computer, and really the first functional computer.

The first electric programmable computer

The Colossus was the first electric programmable computer; it was developed by Tommy Flowers and first demonstrated in December 1943. Colossus was created to help the British code breakers read encrypted German messages.

The first digital computer

Short for Atanasoff-Berry Computer, the ABC began development under Professor John Vincent Atanasoff and graduate student Cliff Berry in 1937, and development continued until 1942 at Iowa State College (now Iowa State University). The ABC was an electrical computer that used vacuum tubes for digital computation, including binary math and Boolean logic, but it had no CPU. On October 19, 1973, US Federal Judge Earl R. Larson signed his decision that the ENIAC patent of Eckert and Mauchly was invalid, and named Atanasoff the inventor of the electronic digital computer.

The ENIAC was invented by J. Presper Eckert and John Mauchly at the University of Pennsylvania; construction began in 1943, and it was not completed until 1946. It occupied about 1,800 square feet, used about 18,000 vacuum tubes, and weighed about 30 tons. Although the judge ruled that the ABC was the first digital computer, many still consider the ENIAC to be the first digital computer because it was fully functional.

The first stored program computer

The early British computer known as the EDSAC is considered to be the first stored-program electronic computer. It performed its first calculation on May 6, 1949, and went on to run one of the first graphical computer games, the noughts-and-crosses game OXO.

The first computer company

The first computer company was the Electronic Controls Company, founded in 1946 by J. Presper Eckert and John Mauchly, the same individuals who created the ENIAC. The company was later renamed the Eckert-Mauchly Computer Corporation (EMCC), and it released a series of mainframe computers under the UNIVAC name.

First computer to store and run a program from memory

First delivered to the United States government in 1950, the UNIVAC 1101, or ERA 1101, is considered to be the first computer capable of storing and running a program from memory.

First commercial computer

In 1942, Konrad Zuse began working on the Z4, which later became the first commercial computer when it was sold to Eduard Stiefel, a mathematician at the Swiss Federal Institute of Technology Zurich, on July 12, 1950.

The first PC (IBM compatible) computer

On April 7, 1953, IBM publicly introduced the 701, its first electronic computer and its first mass-produced computer. Later, in 1981, IBM introduced its first personal computer, the IBM PC. The machine was code-named, and is still sometimes referred to as, the Acorn; it had an 8088 processor and 16 KB of memory, expandable to 256 KB, and ran MS-DOS.

The first computer with RAM

On March 8, 1955, MIT introduced the Whirlwind machine, a revolutionary computer that was the first digital computer with magnetic core RAM and real-time graphics.

The first transistor computer

The TX-0 (Transistorized Experimental computer) was the first transistorized computer, demonstrated at the Massachusetts Institute of Technology in 1956.

The first minicomputer

In 1960, Digital Equipment Corporation released the first of its many PDP computers, the PDP-1.

The first mass-market PC

In 1968, Hewlett Packard began marketing the first mass-marketed PC, the HP 9100A.

The first workstation

Although it was never sold commercially, the first workstation is considered to be the Xerox Alto, introduced in 1974. The computer was revolutionary for its time, comprising a fully functional computer, display, and mouse, and it operated like many computers today, using windows, menus and icons as an interface to its operating system.

The first microprocessor

Intel introduced the first microprocessor, the Intel 4004, on November 15, 1971.

The first personal computer

In 1975, Ed Roberts coined the term “personal computer” when he introduced the Altair 8800, although the first personal computer is considered by many to be the Kenbak-1, first sold for $750 in 1971. The Kenbak-1 relied on a series of switches for inputting data and output data through a series of lights.

The Micral is considered to be the first commercial non-kit computer, sold fully assembled. It used the Intel 8008 processor and sold for $1,750 in 1973.

The first laptop or portable computer

The IBM 5100 was the first portable computer, released in September 1975. It weighed 55 pounds and had a five-inch CRT display, a tape drive, a 1.9 MHz PALM processor, and 64 KB of RAM.

The first truly portable computer, or laptop, is considered to be the Osborne I, released in April 1981 and developed by Adam Osborne. It weighed 24.5 pounds, had a 5-inch display, 64 KB of memory and two 5 1/4″ floppy drives, ran the CP/M 2.2 operating system, included a modem, and cost US$1,795.

The IBM PC Division (PCD) later released the IBM Portable in 1984, its first portable computer, which weighed in at 30 pounds. In 1986, IBM PCD announced its first laptop computer, the PC Convertible, weighing 12 pounds. Finally, in 1994, IBM introduced the IBM ThinkPad 755CD, the first notebook with an integrated CD-ROM.

The first Apple computer

Steve Wozniak designed the first Apple computer, known as the Apple I, in 1976.

The first PC clone

The Compaq Portable is considered to be the first PC clone and was released in March 1983 by Compaq. It was 100% compatible with IBM computers and was capable of running any software developed for them.

See the list of other major computer company firsts below for other IBM-compatible computers.

The first multimedia computer

In 1992, Tandy Radio Shack became one of the first companies to release a computer based on the MPC standard, with its introduction of the M2500 XL/2 and M4020 SX computers.

Other major computer company firsts

Below is a listing of some major computer companies’ first computers.

Compaq – In March 1983, Compaq released its first computer, and the first 100% IBM-compatible computer, the “Compaq Portable.”
Dell – In 1985, Dell introduced its first computer, the “Turbo PC.”
Hewlett Packard – In 1966, Hewlett Packard released its first general computer, the “HP-2115.”
NEC – In 1958, NEC built its first computer, the “NEAC 1101.”
Toshiba – In 1954, Toshiba introduced its first computer, the “TAC” digital computer.

A Writer’s Daily Tools…

As writers, we like putting pen to paper, so to speak. But have you ever wondered where the fountain pen, biro or pencil originated? These are instruments we use daily…

The idea of a writing instrument designed to carry its own ink supply had long been a theoretical one, but it was not until the beginning of the 1700s that the theory was put into practice.

THE FOUNTAIN PEN: The first fountain pen was designed by M. Bion, a Frenchman, in 1702, but it was more than a hundred years later, in 1809, that the American Peregrin Williamson patented his own design.

In 1819, John Scheffer obtained a British patent for his design of a half-quill, half-metal pen.

Then in 1831 John Jacob Parker patented his design for a self-filling pen, but it was Lewis Waterman who patented the first practical fountain pen, in 1884.

If it had not been for the early fountain-pen inventors, who used the hollow channel of a bird’s feather to create an ink reservoir and so replace the constant dipping into an ink well, we would never have started a creative revolution in the design of new pens. Another early design used a reservoir made of hard rubber with a metal nib fitted at the bottom; unfortunately, this did not have the required effect of producing a smooth writing instrument.

It was Lewis Waterman, an insurance salesman, who was inspired to improve upon the early fountain-pen design by adding an air hole in the nib and three grooves in the feed mechanism. His design consisted of the nib, which made contact with the paper; a feed, to control the ink flow; and the barrel, which held the nib, protected the reservoir, and gave the writer something to hold.

By now, pen design had reached a stage where all pens contained some form of internal reservoir to hold ink.

These pens had an internal reservoir consisting of a self-filling rubber sack, open at one end. To fill the pen with ink, the reservoir was squeezed flat by an internal plate, the pen’s nib was inserted into the ink, and as the pressure was released the reservoir filled up.

The period from the late 1800s through to the mid 1900s saw a battle as pen companies each sought to become the brand leader in reservoir pen designs. The earliest known design of that time was the eyedropper pen, which had no internal filling mechanism: most opened by unscrewing a section of the pen, after which the barrel was filled with ink using an eyedropper. As long as the seal was tight, no ink would leak out.

Parker introduced the Button Filler, which had an external button connected to the internal pressure plate. Then Walter Sheaffer responded by designing the Lever Filler, a slight variation on the Button Filler that used an external lever fitted flush with the pen. Back came Parker, not to be outdone, with their Click Filler, which used two protruding tabs to deflate the reservoir and clicked when the reservoir was full. Then Waterman introduced the Coin Filler, with a slot in the barrel: by use of a coin, one could deflate and fill the reservoir.

Some of the early inks were known to corrode steel nib tips, which led to the introduction of the gold tip. However, gold had its own problem: it was too soft for the purpose of writing. To overcome this design flaw, makers tipped the nib with iridium, a hard, yellow-white chemical element that occurs in platinum ores.

Early nibs were available in straight, oblique and italic designs. As the years went by and the need to communicate grew, so did the demand for pens, and with it a larger selection of pen nibs: wider, longer and shaped.

Everything changed in the early 1950s with the introduction of a new range of fountain pens that did away with the built-in reservoir. They would revolutionise pen design for the future.

The reservoir was replaced with a disposable ink cartridge, originally made of glass and later of a rubbery plastic. When cartridges arrived on the scene they were an immediate success… sixty years on, they are still going.

THE BALLPOINT PEN: Laszlo Biro, a Hungarian journalist, observed that newspaper ink dried quickly and was smudge-free. His creative juices were activated, and by 1938 he had invented the first ballpoint pen.

The thicker ink used for newspapers would not flow unaided, which led to a small ball bearing being fitted to the pen’s tip. The idea was that as the ball rotated, it collected ink from the reservoir and placed it on the paper. So simple, yet so clever.

In 1940, Laszlo Biro and his brother George emigrated to Argentina. They applied for a new patent in 1943 and sold licensing rights to the British, as the Royal Air Force needed a pen that would not leak at high altitudes. The success of this pen brought it to the forefront of pen design.

Laszlo and George Biro went on to form the Eterpen Company and commercialised the Biro pen, which was hailed as a great success. One of its main advantages was that it only needed refilling once a year.

The Biro brothers neglected to apply for a U.S. patent, and as World War Two was coming to an end, a new battle was just starting: the battle of the ballpoint pens.

In May 1945, the Eversharp Company joined forces with Eberhard Faber, acquiring Biro Pens of Argentina and rebranding the product as the Eversharp CA (Capillary Action).

Milton Reynolds saw the Biro pen whilst in Buenos Aires and returned to America with a few of them. By October 1945, Reynolds had copied the Biro design, thus breaking Eversharp’s patent rights, and started the Reynolds Pen Company. The release of his pen was an overwhelming success: launched on 29 October 1945 and priced at $12.50, it sold $100,000 worth on the first day.

Eversharp sued Reynolds for breach of patent rights.

By December 1945, England’s Miles-Martin Pen Company had stepped in, releasing their own design of the Biro pen.

Advertisers claimed these pens would write for two years before needing a refill. Sales rocketed, but it was not long before problems arose: some pens leaked, some worked only some of the time, and others failed altogether.

Consumers grew dissatisfied with the ballpoint pen, sales nosedived, and by 1951 the ballpoint pen had died a consumer’s death.

In January 1954, Parker Pens tossed their hat into the ring by introducing their version of the ballpoint pen, known as the Jotter, and it worked; the battle of the ballpoint pens had been won. Then in 1957 they introduced a textured tungsten carbide ball bearing in their pens.

A French baron named Bich removed the ‘h’ from his name and started the BIC pen company in the 1950s; by the end of the decade BIC had acquired 70% of the European market.

By 1958, BIC, by then a major player in the pen market, had acquired 60% of Waterman Pens, and by 1960 it owned Waterman Pens outright.

BIC still dominates the market, selling cheap ballpoint pens, whilst the likes of Parker, Sheaffer and Waterman sell the expensive ones.

THE PENCIL: In 1564, graphite, a form of carbon, was discovered in the Seathwaite Valley in Borrowdale, England, and so the first pencils were produced.

The main breakthrough in pencil technology came in 1795, when the French chemist Nicolas Conte used a mixture of fired clay and graphite before housing it in a wooden case.

Pencils got their name from an older English word meaning ‘brush’. Conte’s method of kiln-firing powdered graphite and clay allowed pencils to be made to any hardness or softness, and the gradings have evolved over the years: H, 2H, 3H, HB, B, 2B, 3B, and so the list goes on.

In the early days, pencils were sharpened with knives, similar to those used to shape feather quills. Then in 1828 a French mathematician, Bernard Lassimone, applied for a patent on an invention to sharpen pencils, and in 1847 Therry des Estwaux invented the first manual pencil sharpener, with a design similar to that found in most stationers today.

John Lee Love, an American, designed the “Love Sharpener”, a simple yet portable design: the pencil was inserted into an opening in the sharpener and rotated by hand, with the shavings remaining inside. It was patented in November 1897.

In the early 1940s, Raymond Loewy designed the first electric pencil sharpener, released by the Hammacher Schlemmer Company.

According to the French scientist Charles Marie de la Condamine, South American Indian tribes were using a form of rubber to attach feathers to their bodies. So in 1736 he brought a sample, in its natural form, to the Institut de France for further study of its possible applications.

The scientist Joseph Priestley stated in 1770 that he had observed a substance being used to wipe pencil marks from paper; this was the same substance Condamine had brought back to France. Early types of rubber had their limitations, with a tendency to rot and crumble.

In came Edward Nairne, an English engineer, who in 1770 introduced rubber onto the market.

Charles Goodyear stepped into the frame in 1839 with a process to stop rubber crumbling and make it a long-lasting product. He named his process vulcanization, after Vulcan, the Roman god of fire, and patented it in 1844. Now that rubber had become a stable product, rubbers, or erasers, became the standard means of removing pencil marks from paper.

What I find quite interesting, though, is that before we started using rubbers or erasers to remove pencil marks from paper, our ancestors used breadcrumbs.

The Birth of Writing

When we look around today at how things are, and how much our daily lives rely on the art of writing, we have to wonder how difficult it must have been in those early times, before writing and the alphabet came into existence.

Primitive man, a nomadic people from whom we are descended, lived on this world of ours some 30,000 or more years ago. They left their story for future inhabitants to find on the walls of caves: pictures and symbols cut into stone using shaped stone tools and bones, often coloured with natural dyes.

They moved around, following herds of animals; as their food moved, so did they. Only when they became less nomadic in their lifestyle, and learnt to cultivate crops and raise herds of cattle, would some form of early language develop… the first steps in communication. So the evolution of man had started: pictures to symbols, and symbols to letters, as the alphabet was developed.

When I think back to my early years and being taught how to write, creating my first ‘o’, then adding a side line and a tail to the right to create an ‘a’, it must have been a thrill for those men of learning who went on to create the very first alphabet.

They produced an early form of writing instrument, made out of stone and sharpened, so they could scratch rock-art pictures on the walls of caves and dwellings. These could depict anything from family life and their offspring to crops and victories over other cave men or animals.

With the discovery of clay, early traders were able to record details of their trading using clay tokens with pictographs. 

Writing forms started out around 3500 BC, when the Sumerians created their own unique style of pictograms, which depicted people or objects. They found they needed more forms of images to express their meaning, which led to the ideogram. In time, single symbols came to represent whole words: logograms.

An example of the changes:

Suppose you had four people standing by a camel. Instead of showing four separate images for each person and one for the camel, the scribe would draw an image of a single person, the sign indicating four, and the camel.

The Sumerians used a wedge-shaped tool, made from a reed, to press signs into the clay tablets they had developed. This new writing system was called cuneiform (“wedge-shaped”).

From these humble beginnings, they developed images to represent sounds, so as to create records in their own spoken language. Once specific images were matched to specific sounds, they took it a step further and recorded works of literature for history.

The Assyrian king Ashurbanipal (668-627 BC) had libraries containing such works as the “Epic of Gilgamesh.”

The cuneiform writing system spread through the Middle East during its 3,000-year history, recording the sounds of many countries’ languages, including Babylonian, Assyrian, Elamite and Hittite, just some of the fifteen or so that used this system of writing.

An early writing system was in its first stages of creation on the island of Crete around 3000 BC; by 2000 BC its people had developed a phonogram-based syllabic script.

All the indications, therefore, are that the early Greeks possessed a writing system. Sadly, their culture and lifestyle were destroyed by Dorian invaders around 1100 BC.

Ancient Egyptian hieroglyphs were discovered in 1988 by the archaeologist Gunter Dreyer at Abydos, south of Cairo, with inscriptions found upon pottery, bones, tombs and clay seals. Radiocarbon analysis of the finds dated them to between 3400 and 3200 BC, which would make them one of the oldest, if not the oldest, examples of Egyptian writing known to exist. Some of the hieroglyphs used in Egypt were similar to cuneiform in that they referred to objects or had links to sounds. Many were used for royalty and deities, as can be seen in the Valley of the Kings, where many pharaohs have their tombs and burial chambers.

The word hieroglyph is Greek in origin and comes from the word hieros; if we follow the root of the word, it means sacred carved stone.

The Egyptians developed other types of script: “hieratic”, a handwritten style produced between 2613 and 2160 BC and used until about 700 BC, which was later replaced by “demotic”, a popular abbreviated version (661-332 BC).

The earliest known Chinese styles still in existence are believed to date back to the Shang Dynasty (1500-1050 BC); inscriptions from this period have been discovered carved into oracle bones and upon Shang bronzes. Egyptian hieroglyphs faded with the rigours of time, whilst the Chinese script survives in one form or another.

The seal script, developed around 221 BC, is still used on seals as a personal signature. By 200 BC a clerical script had come into existence for the purpose of book-keeping, and a grass script for note-taking.

China’s highest written art form has to be that of calligraphy, produced using a brush.

The Phoenicians once belonged to the Aramaic people; they settled in Syria before 1000 BC and were established seafaring traders. The writing systems of the Phoenician and Aramaic peoples were similar.

The Aramaic people were suppressed and scattered by the Assyrian invasions of their lands sometime after 732 BC. By then, much of the Babylonian language and its cuneiform writing system had been replaced by their own, before being lost…

Aramaic scripts spread across the Assyrian Empire and on to the lands of Afghanistan, India and Mongolia. From these small steps, new writing systems developed: modern Arabic, Hebrew and Persian scripts, and the Brahmi script used in India.

Aramaic was the language of Jesus and his disciples, and its script long outlived them: in the 5th century AD, St. Mashtots used it as the basis of the new alphabet he introduced for the Armenian people.

The Arabic script of Islam is a descendant of the Nabataean script; these scripts first started appearing around 300 AD.

Phoenician had a direct connection with the hieratic and demotic scripts of Ancient Egypt. Once a standard style had been developed for the Arabic script, the Koran, a sacred text, was written down, and the script spread through North Africa, Asia, India and China. Its path into the lands of Western Europe was halted by Charles Martel, who defeated the Saracen armies at Poitiers in 732 AD.

If we cross the Pacific Ocean and come forward in time to AD 300-900, we reach the Maya civilisation in Central America. It is here that glyph pictograms have been discovered upon sculptures, pottery, murals and public buildings, most believed to date to the Classic period (AD 250-900), whilst others are known to belong to the Late Preclassic period (400 BC to AD 250). The inscriptions detail historical events, alliances, wars and marriages.

The Maya glyphs are made up of square blocks, each with its own inscription, placed in horizontal and vertical rows and read from left to right.

The first known alphabet was developed around 1500 BC by the Semites in Syria and Palestine, who used their own set of characters as signs for the consonants of syllables.

Around 1000 BC the Phoenicians developed an alphabet, which the Greeks modified: they wrote their lines from left to right and added symbols for vowels. Nowadays all Western alphabets are based on the early Greek alphabet.

In the early days of writing there was only uppercase lettering; around 600 AD lowercase was introduced, along with finer writing pens for the purpose.

The earliest implements resembling pen and paper were developed by the Greeks, who used a nib made of metal, bone or ivory.

It was the Grecian scholar Cadmus who is credited with inventing the written letter: a text message carried from one person to another.

Indian ink was invented by the Chinese philosopher Tien-Lcheu in 2697 BC, from soot, lamp oil, gelatine of donkey skin and musk, and was in common use by 1200 BC. Other cultures developed inks using natural dyes, with berries, plants and minerals for colour.

With the invention of ink came the introduction of papyrus paper, created around 2500 BC by the Egyptians from a water plant, the papyrus, and used by the early Egyptians, Romans, Greeks and Hebrews.

We now had paper and ink, but needed an effective way of bringing the two together. So it was the Romans who created a reed style of pen from the hollow tubular stems of marsh grasses. By cutting one end to create a nib or point with which to write, they could fill the stem with ink and squeeze it, forcing the fluid into the nib.

By 400 AD a stable form of ink had been developed, consisting of iron salts, nutgalls and gum, and it would remain in use for centuries. When first applied to paper it was bluish-black in colour, turning truly black and then fading to a dull brown over the years.

A wood-fibre paper had been invented in China around 105 AD and was brought to Spain by the Arabs in 711 AD.

The writing instrument that dominated history was the quill pen, as used by calligraphers; first introduced around 700 AD, it was made from bird feathers. Goose feathers were most commonly used; swan feathers, being scarce, were classed as premium grade; and crow feathers were used for straight lines.

Plant-fibre paper became the primary medium for writing after Johannes Gutenberg’s dramatic invention of the printing press, with its wooden and metal letters, in 1436.

Articles written by hand had resembled printed letters until scholars began to change the form of writing, using capitals and small letters and writing with more of a slant, connecting the letters. The running hand, or cursive, style of handwriting with Roman capitals and small letters (uppercase and lowercase) was invented by Aldus Manutius of Venice in 1495 AD, and by the end of the 16th century we had the twenty-six-letter alphabet as we know it today.

The history of writing in Britain started in the 5th century AD, with the Anglo-Saxons.  By the 7th century AD, the Latin alphabet had been introduced.

The Normans invaded our shores in 1066, and the English language was relegated to the poor, whilst nobility, clergy and scholars spoke and read Norman or Latin.  By the 13th century, the English language had become the most prominent language once again, having been influenced by two centuries of Norman rule.

The Acorn BBC Micro Computer

The BBC Micro computer was launched in December of 1981 as part of the BBC’s Computer Literacy Project. The Computer Literacy Project was created by the BBC (British Broadcasting Corporation) to increase computer literacy and to encourage as wide a range of people as possible to gain hands-on experience with a microcomputer. The BBC Micro was very successful in the UK, selling over 1.5 million units, and was widely used in schools, with the large majority of schools having one.

As part of the project, the BBC wanted to commission the development of a microcomputer that could accompany the TV programmes and literature. The BBC created a list of required specifications in early 1981, with companies including Acorn Computers, Sinclair Research, Tangerine Computer Systems, Newbury Laboratories and Dragon Data showing interest in the project. Acorn’s bid won in March 1981 with their Proton prototype, which was being developed as the successor to the Acorn Atom. While the BBC Micro was launched in December 1981, production problems meant that deliveries of the computer were delayed.

The BBC Microcomputer, or the ‘Beeb’, is based on the 6502A microprocessor, which ran at 2 MHz, and has 32K of ROM. The Model A shipped with 16K of RAM and cost £299; the Model B shipped with 32K of RAM, cost £399, and offered higher-resolution graphics thanks to the extra RAM. Both models used the same circuit board, making it possible to upgrade a Model A to a Model B. The machine’s high cost was compensated for by its impressive expansion possibilities, including disc drives, a second processor and network capabilities (Econet).

The BBC Micro used BBC BASIC, a version of the BASIC programming language created mainly by Sophie Wilson for the BBC Micro.

The BBC Micro is housed in a case which includes an internal power supply and a 64-key keyboard with 10 additional user-definable keys. On the back of the case there are ports for UHF out, video out, RGB, RS-423, cassette, analogue in and Econet.

“For me this was the machine that really got me into programming and microelectronics. The BBC Micro was developed by Acorn Computers for the BBC, which was embarking on an education programme for the UK called the ‘BBC Computer Literacy Project’. The BBC made it their mission to have at least one of these machines available in every school in the UK.

The ‘beeb’, as it quickly became known, was fantastic for connecting to external equipment. It featured an analogue ‘joystick’ port, a digital ‘user’ port, a 1MHz bus connection, a ‘tube’ connection and a plethora of other connections. So many, in fact, that the back of the machine ran out of space and they had to create a cut-away bay underneath the machine to accommodate them. But it was due to its connectivity and expandability that I really took to the beeb and started designing peripherals and software.

It was not a cheap machine. The BBC Model B sold for £399 on the high street in 1983, which was relatively expensive compared with other available machines like the Commodore 64, which sold for around £229. Regardless of the difference in price, because it was backed by the BBC, the beeb sold very well, with over 1 million units sold.”

Manufacturer: Acorn
Date: 1981

Birth of the Computer

Who invented the computer that is now part of everyday use? To answer that, we need to go back to where it all began!

The earliest form of counting device was the tally stick, but the one we all remember from our childhood days is the abacus, first used in Babylonia around 2400 BC.

The astrolabe and the Antikythera mechanism, analog mechanical computers, were used in Ancient Greece around 150-100 BC to perform astronomical calculations. They were followed by the planisphere of Abu Rayhan al-Biruni around AD 1000, the equatorium of Abu Ishaq Ibrahim al-Zarqali around AD 1015, and the astronomical clock tower of Su Song, built during the Song Dynasty around AD 1090.

In 1617, John Napier, the Scottish mathematician and physicist, invented Napier’s Bones, a device similar in appearance to an abacus which could perform multiplication and division. In the 1620s the slide rule was invented, a device that used distances and line intervals to perform multiplication and division. The slide rule faded from use with the invention of the pocket calculator.

Wilhelm Schickard, a German, designed the Calculating Clock in 1623, but it was destroyed by fire during construction in 1624 and never rebuilt.

In 1642 Blaise Pascal invented a mechanical calculator, duly named Pascal’s Pascaline.

The Stepped Reckoner was invented by Gottfried Wilhelm von Leibniz in 1672; the idea came about whilst he was using Pascal’s Pascaline machine.

The Frenchman Joseph Marie Jacquard invented the powered loom in 1801. It used punched wooden cards which defined the weave’s pattern; in relation to today’s world of computers, these wooden cards were the equivalent of a software program.

The Arithmometer, invented by Charles Xavier Thomas de Colmar, became the first mass-produced mechanical calculator in the 1820s, for it could add, subtract, multiply and divide.

In 1837 Charles Babbage designed the first mechanical computer, his Analytical Engine. The device was never finished during his lifetime, and it was left to his son Henry to complete the work in 1888 in a simplified form: the Mill.

Ada Lovelace, daughter of the poet Lord Byron, was an analyst of Babbage’s Analytical Engine and went on to create the first computer program between 1842 and 1843, for it was her vision that computers would be capable of performing more than basic arithmetic calculations.

Towards the end of the 1880s, Herman Hollerith invented machines to read and record punched cards: the tabulator and the keypunch, combined in devices like the Hollerith desk used by the U.S. to carry out the 1890 census. On the back of its success he went on to found the Tabulating Machine Company, which eventually became International Business Machines (IBM).

Alan Turing is considered by many to be the father of computer science, for it was in 1936 that he provided the concepts of algorithm and computation with the Turing machine, the blueprint for the first electronic digital computer.

Just think: when you turn on your computer, you’re actually using a design based on the brainchild of Alan Turing.
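To make the idea less abstract, here is a minimal sketch in Python of a Turing machine: a tape, a read/write head, and a table of state-transition rules. The rule table below is a hypothetical example invented for illustration (it increments a binary number), not one of Turing’s own:

```python
# A toy Turing machine: at each step it reads the symbol under the head,
# looks up (state, symbol) in the rule table, writes a symbol, moves the
# head left or right, and switches state, until it reaches "halt".

def run_turing_machine(tape, rules, state="start"):
    head = 0
    while state != "halt":
        symbol = tape[head] if head < len(tape) else "_"  # "_" = blank
        write, move, state = rules[(state, symbol)]
        if head < len(tape):
            tape[head] = write
        else:
            tape.append(write)
        head += 1 if move == "R" else -1
    return tape

# Rules to add 1 to a binary number: scan right to the end of the
# digits, then carry back leftwards. (The sketch assumes the carry
# never runs off the left edge of the tape.)
RULES = {
    ("start", "0"): ("0", "R", "start"),
    ("start", "1"): ("1", "R", "start"),
    ("start", "_"): ("_", "L", "carry"),
    ("carry", "1"): ("0", "L", "carry"),  # 1 plus carry -> 0, carry on
    ("carry", "0"): ("1", "R", "halt"),   # 0 plus carry -> 1, done
}

print(run_turing_machine(list("1011"), RULES))
# prints ['1', '1', '0', '0', '_'], i.e. 1011 + 1 = 1100
```

Everything a modern CPU computes can, in principle, be reduced to steps of this kind, which is why the model was so influential.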

In 1947, Howard Aiken was commissioned by IBM to determine how many computers it would take to run the United States… His answer was six. How wrong he was: who would have believed that, some sixty-six years later, most homes would have at least one computer?

In 1936 the first computer was built by Konrad Zuse: the Z1, believed to be the first electro-mechanical binary programmable computer.

In November 1937, whilst working at Bell Labs, George Stibitz invented the “Model K”, a relay-based calculator that used binary circuits to perform calculations.

John Atanasoff, a physics professor from Iowa, built the first electronic digital computer starting in 1937, assisted by graduate student Clifford Berry. It had not been constructed as a programmable machine, for its main purpose was to solve linear equations.

Konrad Zuse, who had built the Z1 back in 1936, took his invention to the next stage in 1941 by building the first program-controlled electromechanical computing machine, the Z3.

Thomas Flowers joined the Post Office Research Branch in 1930, where he became Head of Switching Research. During the 1930s Flowers pioneered large-scale digital electronics, and in 1943 he designed and constructed the British computer Colossus.

Harry Fensom joined Flowers’ inner circle of engineers at the Research Branch of the British Post Office in 1942. He participated in the construction of the code-breaking machine Colossus and was responsible for keeping it in continuous operation at Bletchley Park.

Colossus was the world’s first electronic programmable computer, consisting of a large number of vacuum tubes. Even though it had its limits when it came to programming, it served its main purpose: breaking German wartime codes.

In 1939 development started on the Harvard Mark I, the Automatic Sequence Controlled Calculator: a general-purpose electro-mechanical computer created by Howard Aiken and programmed by Grace Hopper, and financed by IBM. It came into use in May 1944.

The ENIAC computer, completed in 1946, was the brainchild of John Presper Eckert and John W. Mauchly. Its architecture required the rewiring of a plug board to change its programming. It was capable of adding and subtracting five thousand times a second, and could also perform multiplication, division and square-root calculations. It weighed in at thirty tons, used two hundred kilowatts of power, and contained eighteen thousand vacuum tubes, fifteen hundred relays, and hundreds of thousands of resistors, capacitors and inductors.

The Small-Scale Experimental Machine, also known as “Baby”, was completed in 1948 at England’s University of Manchester, built upon the stored-program architecture. On 21 June 1948 it made the first successful run of a stored program, using a 32-bit word length and a memory of 32 words.

The Manchester Mark I, a more powerful machine, was built to supersede the Baby, with expanded size and power and a magnetic drum for auxiliary storage.

Later that year, Cambridge University built the Electronic Delay Storage Automatic Calculator (EDSAC), which was fitted out with a built-in program used to load and run others.

In October 1948 the Government asked Ferranti to build a commercial computer based on the design of the Manchester Mark I. The Ferranti Mark I included enhancements making it more powerful and faster, and the first machine was rolled out in February 1951.

John Presper Eckert and John W. Mauchly, who designed and built the ENIAC, updated their design in 1951 with the release of the UNIVAC, for use by the U.S. Census Bureau. It used 5,200 vacuum tubes and consumed some 125 kW of power; storage was by way of serial-access mercury delay lines.

In the early 1950s two Soviet scientists, Sergei Sobolev and Nikolay Brusentsov, designed the Setun, a ternary computer that operated on a balanced base-three numbering system, with digits -1, 0 and 1, rather than the conventional binary system. The computer was used within the Soviet Union, but its life was short-lived, and the architecture was eventually replaced with a binary system.
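To see what balanced ternary means in practice, here is a minimal sketch in Python, purely illustrative, of how an integer is written in the Setun’s digit system:

```python
# Balanced ternary: digit i carries weight 3**i, and each digit is
# -1, 0 or +1 instead of binary's 0 and 1.
def to_balanced_ternary(n):
    if n == 0:
        return [0]
    digits = []  # collected least-significant first
    while n != 0:
        r = n % 3          # ordinary ternary remainder: 0, 1 or 2
        if r == 2:         # rewrite 2 as -1 and carry 1 upwards
            r = -1
        n = (n - r) // 3
        digits.append(r)
    return digits[::-1]    # most significant digit first

print(to_balanced_ternary(7))   # [1, -1, 1]  since 9 - 3 + 1 = 7
print(to_balanced_ternary(-7))  # [-1, 1, -1]: negating just flips digits
```

One elegant property, visible in the example, is that negation needs no separate sign bit: flipping every digit negates the number.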

In 1952, IBM released its first electronic data processing machine, and its first mainframe computer, the IBM 701. Then in 1954 the IBM 704 came onto the market, using a magnetic core memory. During 1955-1956, IBM developed the Fortran programming language for use with the IBM 704, releasing it in 1957.

In 1954, IBM produced a smaller computer, the IBM 650, weighing in at 900 kg with its power unit at a further 1,350 kg. As built, it had a drum memory unit which could hold 2,000 words, later increased to 4,000 words, with a maximum of ten characters per word. The IBM 650 used SOAP (the Symbolic Optimal Assembly Program).

Microprogramming was invented by Maurice Wilkes in 1951.

Then in 1956 IBM created the disk storage unit, the IBM 350 RAMAC (Random Access Method of Accounting and Control). It used fifty 24-inch metal disks with one hundred tracks per side, and was capable of storing five megabytes.

John Presper Eckert and John W. Mauchly had recognized the limitations of the ENIAC even before its construction was completed. They started researching the possibility that programs and working data could be stored in the same memory, something which at the time would have been considered rather radical.

Equipment of the mid-1950s stored data in acoustic delay lines using liquid mercury or a wire. These worked by sending acoustic pulses, representing a 1 or 0, down the line, with an oscillator re-sending each pulse as it arrived. Other systems on the market at the time used cathode-ray tubes, storing and retrieving data on a phosphor screen.

Magnetic core memory, in which each core stores one bit, was created in 1954, replacing many forms of temporary storage, and it would go on to dominate the market for many years to come.

The bipolar transistor of 1947 went on to replace vacuum tubes from 1955. The early versions were germanium point-contact transistors, which consumed less power, but reliability was an issue.

The University of Manchester built the first transistorized computer in 1953, and an updated version was running by 1955. It used two hundred transistors and thirteen hundred solid-state diodes, with a power consumption of 150 watts. The Harwell CADET, which had no tubes at all, had a tendency to crash every ninety minutes, but by changing to more reliable bipolar junction transistors the crashes were reduced.

Comparing vacuum tubes and transistors, the transistors had many advantages: they were smaller in design and required less power, which meant they gave off less heat. Transistorized computers contained tens of thousands of binary logic circuits in a compact space.

With the creation of transistorized electronics came the central processing unit, within which sat the ALU (Arithmetic Logic Unit), which performed arithmetic and logic operations; it was the first of many devices that would show continual improvement.

In a sense, new technology had opened the floodgates to improved parts for the computer: where once they would have taken up the space of a large room, technology had reduced them in size until they were capable of sitting upon a table. One such part was the data-disk storage unit, capable of storing tens of millions of letters and digits, alongside removable data-disk storage units; these are forms of input/output, the means by which a computer exchanges information.

Telephone connections went on to provide sufficient speeds for early remote terminals like the Teletype or Telex machine.

Who would have believed that these stand-alone computers would one day be the basis of the Internet?

Jack Kilby and Robert Noyce designed the integrated circuit (microchip) in 1958, which led to the invention of the microprocessor. Then in 1964 IBM released its Solid Logic Technology modules, a form of hybrid circuit.

Intel opened its doors in 1968; in its early days it produced semiconductor memory and went on to create DRAM and EPROM. Intel developed its microprocessor in 1971, cramming an entire computer onto a single chip.

This was the Intel 4004, the first microprocessor, consisting of 2,300 transistors and clocked at 108 kHz. Intel followed up with the 8008 and 8080 models.

The 8080 was used in the MITS Altair computer kit, a machine that attracted one Bill Gates, then a Harvard freshman, to drop out of college and write programs for it.

Alan Shugart and IBM invented the floppy disk in 1971, nicknamed the “floppy” for its flexibility.

The idea of computers sharing information with one another, using telecommunication technology, had been around for years. Then in 1973 Robert Metcalfe and Xerox created the Ethernet computer networking system.

Olivetti, a company more associated with typewriters, presented its first personal computer, the P6060, to the world in 1975. It had a 32-character display, an 80-column thermal printer and 48 KB of RAM, used the BASIC language, and weighed in at 40 kg.

In 1981 Bill Gates and Microsoft supplied the world with MS-DOS, an operating system to run the computer. That same year IBM released its home computer, and so the home computer revolution had started.

In 1983 Apple released the Lisa, its first home computer with a graphical user interface; in 1984 Apple followed with the Macintosh, a more affordable home computer with a graphical user interface.

In 1985 Bill Gates and Microsoft released a new operating system that would revolutionise the computer for decades to come: Microsoft Windows, which has been upgraded over the years. We have now reached Windows 10.

With the 1990s came e-mail and the World Wide Web… and computers and the Internet would change our world for ever.