Informatics Practices for Class XI (CBSE)/Computer System Organization

Note: Based on the History of Computers book

From the dawn of time, human beings have tried to find new ways to solve problems, be more productive, work with numbers faster, and store information better. Early humans may have used stones to count items, which led to the abacus, then to the slide rule, and later to calculators. These machines allowed human beings to do such work faster and more reliably than they could in their heads.

Early Computers

The Mechanical Age

Since the early ages of computer history there have been innovations that led to the advancement of technology. The first computers were mechanical, and sometimes prone to errors. They were calculating machines. Blaise Pascal built a numerical wheel adding machine in 1642 to help his father, a tax collector. Adding columns of figures by hand was a heavy burden, and Pascal saw a chance to relieve it.

In 1673 Gottfried Wilhelm von Leibniz, a German mathematician, built a calculating device that could add, subtract, multiply, and divide. It provided more functions than Pascal's machine and let its users solve more problems. Yet neither Pascal's nor Leibniz's machine was totally dependable, and both suffered from flaws.

Mechanical Innovations

Joseph Jacquard, a French weaver, designed a punch-card loom in 1805. A chain of punched cards, arranged in a particular order, provided the instructions that controlled the loom, allowing patterns to be woven into the cloth automatically. The pattern could be changed simply by using a different set of cards. This idea later led to the storing of computer instructions on punched cards.

In 1820, as technology progressed, Charles Xavier Thomas, another Frenchman, built a new mechanical calculator. He called it the four-function machine, and it was more reliable than Pascal's or Leibniz's machines; Thomas had learned from their work and from their flaws.

Larger Scale Mechanical Computers and Logic

Charles Babbage, later aided by Ada Lovelace, made his contributions beginning in the 1820s. The Difference Engine, the machine that became the template for the Analytical Engine, was an automatic logarithm tabulator and printer. The Analytical Engine's design added a memory unit, automatic printout, sequential program control, and punch-card input, the last an idea borrowed from Jacquard's loom.

Babbage worked on his machines for some 20 years with funding from the British government, which threatened to withdraw its support because it had nothing to show for its investment. The project needed someone new to help out, and that help came from Ada Lovelace, daughter of Lord Byron and Lady Annabella Milbanke. Lovelace corrected mistakes in Babbage's instructions, becoming the world's first debugger and setting a milestone for women in computing history. She also suggested that a binary system of numbers be used, which set the standard that future computers would follow.

Sadly, the Difference Engine did not function properly: the technology of the day could not produce gears and shafts accurate enough for the design. Yet it helped pave the way for future computers. The IBM Corporation was later able to build a working model of the Difference Engine using more modern parts.

Early Electronic Computers

Early Programming Languages

The first version of a programming language arose from the work of Ada Lovelace, the collaborator of Charles Babbage. Unfortunately her work went mostly ignored, since Babbage never built a completed Analytical Engine and her programs were never publicly deployed. She is remembered in Ada, a programming language still in use on military-grade projects.

Wiring and Raw Binary

Early work with analog and electromechanical computers did not involve programming languages as we know them. Since early computers had to be wired directly for each problem set, the process of setting plugs took the place of a text-based statement of an algorithm.

As electromechanical machinery gave way to mercury delay lines and drum memory, it became possible to write directly to addresses in memory and provide instructions without rewiring. This typically meant writing what we would call "machine code". Such code is usually written out in hexadecimal ("hex") notation, since each hex digit stands for four binary digits: a 32-bit word fits in 8 hex digits and a 64-bit word in 16.
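
A quick sketch in Python of why hex is a convenient shorthand for raw machine code. The byte values below are arbitrary examples chosen for illustration, not instructions for any real processor:

    # Arbitrary example bytes standing in for raw machine code.
    machine_code = bytes([0xB8, 0x05, 0x00, 0x00, 0x00])

    for byte in machine_code:
        # Each byte is eight binary digits but only two hex digits.
        print(f"{byte:08b}  ->  0x{byte:02X}")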

Writing code at the machine level is difficult: it requires the programmer to know the specific locations of the hardware registers and the instruction set of the processor. Reading machine code is often harder than writing it, so tracking changes in such code is nearly impossible for a human.

Assembly

The first upgrade to the machine-code level of programming was assembly language. Assembly provides symbolic names (mnemonics) for instructions and registers in place of their raw binary encodings. It can still be difficult to read, and it still requires knowing which instructions and register locations exist, but it can be read on paper or screen and assembled into machine code instruction by instruction.
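
As a loose illustration, here is a minimal Python sketch of what an assembler does. The mnemonics and opcode numbers are invented for this example and belong to no real processor:

    # Invented opcode table for a hypothetical processor.
    OPCODES = {"LOAD": 0x01, "ADD": 0x02, "STORE": 0x03, "HALT": 0xFF}

    def assemble(source):
        """Translate mnemonic lines such as 'ADD 7' into raw bytes."""
        program = bytearray()
        for line in source.splitlines():
            mnemonic, *operand = line.split()
            program.append(OPCODES[mnemonic])
            if operand:
                program.append(int(operand[0]))
        return bytes(program)

    print(assemble("LOAD 5\nADD 7\nSTORE 9\nHALT").hex(" "))
    # Prints: 01 05 02 07 03 09 ff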

Assembly programming has not gone away. Most device drivers for computer peripherals are written in C, but some real-time glitches are best resolved by hand-optimizing the intermediate assembly output of the C compiler. This is becoming increasingly rare.

FORTRAN

FORTRAN stands for FORmula TRANslation. The language was invented at IBM in the mid-1950s for the IBM 704 series computer.

BASIC

BASIC, an acronym for Beginner's All-purpose Symbolic Instruction Code, refers to a family of high-level programming languages. It was originally designed in 1963 by John George Kemeny and Thomas Eugene Kurtz at Dartmouth College to allow students outside the science fields to use computers. At the time, all computer use required writing custom software, which was something only scientists and mathematicians tended to do. BASIC became widespread on home microcomputers in the 1980s and remains popular to this day in a handful of heavily evolved dialects.

COBOL

COBOL, an acronym for COmmon Business Oriented Language, is a high-level programming language developed in the 1960s and still used in business applications. It is used extensively in the financial services industry for large-scale mainframe-based applications. Its instructions resemble English statements, and it imposes an overall framework on a program. The design goal for COBOL was a self-documenting language that could be revised and maintained easily.

PL/1

PL/1 (Programming Language One) is a high-level programming language designed for scientific, engineering, and business applications. It is one of the most feature-rich programming languages and one of the very first in that category. It has been used by academic, commercial, and industrial users since its introduction in the early 1960s, and it is still actively used today. It supports recursion and structured programming, and its English-like syntax is well suited to describing complex data formats, with a wide set of functions available to verify and manipulate them.

Early 4GL Systems

  • MARK IV - data extraction from existing files, specified at the source level
  • NATURAL - a language for extracting data from ADABAS files
  • IBM RPG - a "Report generator" language that could also generate other types of applications

Analysis and Design

Early Operating Systems

Mainframe Operating Systems

  • IBM BOS
  • IBM TOS
  • IBM DOS
  • IBM MFT
  • IBM MVT
  • IBM MVS
  • IBM VM
  • ICL GEORGE

Personal Computer Operating Systems

  • QDOS
  • CP/M
  • MS-DOS

Operating system 'wars'

When the PC was introduced, it needed an operating system. IBM approached Digital Research, the company founded by Gary Kildall, seeking the use of its CP/M, a popular operating system on earlier machines. (It was, in fact, the first operating system that was not hardware-specific.) IBM did not want to pay royalties, however, but sought a one-time purchase that included a rename. Digital Research refused, and IBM withdrew. IBM then approached Microsoft and Bill Gates, who purchased an existing operating system (Seattle Computer Products' 86-DOS) and renamed it MS-DOS. That name was later used on non-IBM models; Microsoft agreed to IBM's wish to use its own name, and the operating system was sold as PC-DOS on the PC.

86-DOS had been modeled on CP/M, and Digital Research took legal action for infringement. IBM settled by offering computer buyers a choice of either system; however, CP/M-86 (as the PC version was named) cost almost $200 more than PC-DOS, and it did not sell well.

MS/PC-DOS quickly became the standard for the PC-compatible market. Digital Research attempted to regain the market, eventually settling on an MS-DOS clone, DR-DOS. DR-DOS was sold off the shelf (while MS/PC-DOS was only sold bundled with new computers) and later gained a large market share with version 5, whose new memory management broke through an early limitation of DOS: a maximum of 640 kB of usable memory.

By this time, Microsoft was holding the market not only with MS-DOS but also with Microsoft Windows, a graphical shell program for DOS. Windows was based on the Macintosh, and Apple filed suit. Complicating the matter was a suit against Apple by Xerox, which claimed to be the rightful owner of the design. Eventually it was ruled that the design elements in question could not be copyrighted, and Macintosh and Windows continued to coexist.

In 1995, Windows was reworked into a self-contained operating system, Windows 95. By this time DR-DOS had been sold twice, becoming Novell DOS 7 and then Caldera DR-DOS 7, and IBM, having split from Microsoft, was developing PC-DOS 6 separately. This new version of Windows, which no longer coexisted with DOS, ultimately became the focus of an antitrust lawsuit against Microsoft. Despite this, Microsoft was able to continue developing Windows.

Today, the market is dominated by the IBM PC-compatible computer, the majority of which run Microsoft Windows. Also present is an up-and-coming system, Linux, an open-source system based on UNIX (an older and more complex operating system, dating to around 1970, used for industrial rather than home purposes). On a separate platform, the Apple Macintosh runs the newest Apple operating system, Mac OS X.

  • OS/2 Warp

Early Database Systems

  • ADABAS
  • DL/1

Early 'Utility' Systems

Sort/Merge

Printer Spooling

  • FIDO - for IBM DOS
  • JES2 - for IBM MVS
  • JES3 - for IBM MVS

Access Methods

  • BTAM - Basic Telecommunications Access Method
  • BATS - Basic Additional Teleprocessing Support
  • TCAM - Telecommunications Access Method
  • QTAM - Queued Telecommunications Access Method
  • VTAM - Virtual Telecommunications Access Method

Teleprocessing Systems

  • FASTER - First Automated Teleprocessing Environment Responder
  • MTCS - Minimum Teleprocessing Control System
  • CICS - Customer Information Control System
  • Intercom
  • Browns Operating System

Early Application Systems

  • "Works Records System " - a multi-user interactive spreadsheet application - produced at ICI Mond Division in Northwich, Cheshire, UK c1974

Early Innovations

The Vacuum Tube

The vacuum tube, also known as a valve because it controls electron flow the way a water valve controls the flow of water, was originally developed to amplify radio signals. Its use as a logic gate, while practical, required massive power consumption and a steady supply of replacement tubes.

Tubes were not meant to be switched on and off rapidly, so they burned out regularly in computers even though they can last for decades in stereo and musical-instrument amplifiers.

Vacuum tubes also need a lot of electricity and produce a lot of heat, so they require plenty of ventilation or air conditioning.

The Transistor

Shockley, Bardeen, and Brattain's invention of the transistor resolved the vacuum tube's limitations and, in the process, started the shrinking of computers.

The transistor is a solid-state logic-gate switch: it needs no heated filament and no change in physical contacts to alter the flow of electrons. Its centerpiece is a semiconductor, a solid material that can act as a conductor or a resistor depending on how the substrate is doped.

The original substrate for transistors was germanium. The move to silicon, one of the most abundant elements on Earth, made transistors cheap to manufacture and eventually made computers ubiquitous.

The Integrated Circuit

Several types of transistors can be combined to make complete logic circuits; an integrated circuit places an entire such circuit on a single piece of semiconductor.
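
As a loose illustration in software (not real electronics), transistors can be treated as switches whose combinations form logic gates. The Python sketch below is invented for this example:

    def nand(a, b):
        # Two switches in series pull the output low only
        # when both inputs are on.
        return not (a and b)

    # Every other gate can be built from NAND alone.
    def not_gate(a):
        return nand(a, a)

    def and_gate(a, b):
        return not_gate(nand(a, b))

    def or_gate(a, b):
        return nand(not_gate(a), not_gate(b))

    for a in (False, True):
        for b in (False, True):
            print(a, b, "->", and_gate(a, b), or_gate(a, b))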

The Microchip

Just as transistors replaced vacuum tubes, the microchip replaced boards of individual transistors: a single chip could take the place of a thousand or more discrete transistors by integrating them onto one piece of silicon.

The Early Microcomputer

Altair

The MITS Altair 8800 was designed around a new microprocessor, the Intel 8080, and debuted in 1975. Its announcement in the January 1975 issue of Popular Electronics magazine left the Albuquerque-based manufacturer with a four-month backlog of orders.

The Altair is considered the first home computer. Its only input was a series of front-panel switches. However, it had a motherboard with a bus, which allowed other companies to provide keyboards, tape readers, and other devices that could access the registers.

Bill Gates, then a Harvard student, and Paul Allen wrote a BASIC interpreter for the MITS machine. The pair left school and work to focus on code development, and not long afterward they created Micro-Soft, later Microsoft.

Apple II

The Apple II was produced by Apple Computer. It came with a built-in BASIC that served as its working environment, and it was marketed toward home use.

Radio Shack

  • TRS-80

Commodore

Commodore Pet

Commodore Vic 20

Commodore 64

The Commodore 64 was a revolution when it was released in 1982, expanding on the VIC-20's 8 colors with a palette of 16 and increasing the screen resolution to 40 columns by 25 lines. A three-voice sound synthesizer topped off the features of this remarkable computer.

In its time, the Commodore 64 was one of the most popular home computers on the market, with thousands of games and business applications available (Microsoft Multiplan, for one, was published for the C64).

Its built-in BASIC, together with a plethora of magazines (Zzap!64, Commodore) carrying program listings for readers to type in, enabled many people to learn BASIC and even machine code.

In 1986, the C64C was introduced. Its changes (a lighter-colored case and a different shape) were mainly cosmetic, and it remained compatible with all previous add-ons and software.

Another version of the C64 was the SX-64, a "portable" Commodore 64 only in the loosest sense of the word: the unit was heavy. It contained a 5-inch color cathode-ray tube (CRT) monitor and a built-in disk drive, and the keyboard doubled as the lid of the unit.

A must-have machine for its time, the Commodore 64 offered:

  • An amazing 64 KB of RAM
  • Built-in BASIC
  • Built-in TV adapter
  • A cartridge expansion port
  • 3-channel audio (via the SID chip)
  • Two 9-pin D-sub controller ports
  • Serial port
  • Optional external floppy drive (the 1541)
  • Optional tape drive (normally supplied with the computer)

Commodore 128

Commodore Plus/4

Amiga

The Amiga microcomputer was far more advanced than other computers of its age in graphical processing, display, and manipulation. At its release, the original Amiga 1000 was the only machine capable of displaying 12-bit color, using a format known as HAM (Hold And Modify). This made it the only machine, for several years, that could put as many as 4096 colors on screen at a time. It was replaced by the Amiga 500, a slightly enhanced and stripped-down version of the Amiga 1000 with some of the same capabilities as well as some different expansion ports. After the Amiga 500 came the Amiga 2000, 3000, 600, 4000, and 1200 machines, each with minor enhancements over the previous model. Currently, the Amiga corporation has handed off development of a new model, the AmigaOne, to a separate company. The new machine is to support speeds of up to 800 MHz, as opposed to the Amiga 4000's top speed of 60 MHz, not counting the PowerPC accelerators released near the end of the Amiga 4000's life cycle, which fared poorly for lack of support from software companies.

Atari

IBM PC

The IBM PC was a personal computer built around the Intel 8088 microprocessor (a close variant of the Intel 8086). It became a standard, and today the majority of personal computers are IBM compatible.

Apple Lisa & Macintosh

Apple Computer followed the Apple II with the Apple Lisa and, after that, the Apple Macintosh. These computers were unique for their graphical user interface (GUI), which introduced the concepts of icons, windows, pull-down menus, and the mouse. (The GUI concept was actually taken from Xerox, via two of its computers, the Xerox Star and the Xerox Alto.)