History of Computers/Programming Languages Evolution
The first recognizable program arose from work by Ada Lovelace, the mathematician who collaborated with Charles Babbage on his Analytical Engine. Unfortunately her work went largely ignored: Babbage never built a completed Analytical Engine, so her program was never run. She is remembered in Ada, a programming language still in use on military and safety-critical projects.
Wiring and Raw Binary
Early work with analog and electromechanical computers did not involve programming languages as we know them. Since early computers had to be wired directly for each problem, setting plugs and switches took the place of writing out an algorithm as text.
As electromechanical machinery gave way to mercury delay lines and drum memory, it became possible to write instructions directly into memory addresses without rewiring the machine. This typically meant writing what we would call "machine code". Such code often gets called "hex" today because machine words are conventionally written in hexadecimal notation: each hex digit encodes four bits, so a 32-bit word is eight hex digits and a 64-bit word is sixteen.
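The relationship between binary machine words and their hex shorthand can be sketched quickly in Python (the bit pattern here is arbitrary, chosen only for illustration):

```python
# Each hexadecimal digit encodes exactly 4 bits, so a 32-bit machine word
# is written as 8 hex digits, and a 64-bit word as 16.
word = 0b10110000_01100001_00000000_00000101  # an arbitrary 32-bit pattern

hex_form = f"{word:08X}"  # format as 8 uppercase hex digits
print(hex_form)           # -> B0610005
print(len(hex_form))      # -> 8
```

The same pattern written out in binary takes 32 characters, which is why hexadecimal became the standard way to display raw machine code.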
Writing code at the machine level is difficult: the programmer must know the hardware's register layout and the processor's instruction set. Reading machine code is often harder still, so tracking changes in it is nearly impossible for a human.
The first step up from machine code was assembly language. It let programmers write mnemonic names for instructions and registers instead of the raw binary values. Assembly can still be difficult to read and still requires knowing the processor's instructions and register locations, but it can be read on paper or screen and assembled into machine code instruction by instruction.
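The mnemonic-to-opcode translation an assembler performs can be sketched as a toy in Python. The instruction set here is entirely made up for illustration; real assemblers also handle labels, addressing modes, and variable-length encodings:

```python
# Toy sketch of what an assembler does: translate mnemonics into numeric
# opcodes, one instruction at a time. These opcodes are invented, not from
# any real processor.
OPCODES = {"LOAD": 0x01, "ADD": 0x02, "STORE": 0x03, "HALT": 0xFF}

def assemble(lines):
    """Translate a list of mnemonic instructions into machine-code bytes."""
    code = []
    for line in lines:
        parts = line.split()
        code.append(OPCODES[parts[0]])          # mnemonic -> opcode byte
        code.extend(int(p) for p in parts[1:])  # operands pass through as numbers
    return bytes(code)

program = ["LOAD 10", "ADD 11", "STORE 12", "HALT"]
print(assemble(program).hex())  # -> 010a020b030cff
```

The roughly one-to-one mapping between mnemonics and opcodes is why assembly, unlike higher-level languages, still requires knowing the target processor's instruction set.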
Assembly programming has not gone away. Most device drivers for computer peripherals are written in C, but some real-time glitches are best resolved by hand-optimizing the intermediate assembly output from the C compiler. This is becoming increasingly rare.
FORTRAN stands for FORmula TRANslation. The language was developed at IBM in the mid-1950s for the IBM 704 series of computers.
In computer programming, BASIC (an acronym for Beginner's All-purpose Symbolic Instruction Code) is a family of high-level languages. It was originally designed in 1963 by John George Kemeny and Thomas Eugene Kurtz at Dartmouth College to allow students outside the sciences to use computers. At the time, all computer use required writing custom software, which only scientists and mathematicians tended to do. BASIC became widespread on home microcomputers in the 1980s and remains popular to this day in a handful of heavily evolved dialects.
COBOL, an acronym that stands for COmmon Business Oriented Language, is a high-level programming language developed in 1959–60 and still used in business applications. It is used extensively in the financial services industry for large-scale, mainframe-based applications. It uses instructions resembling English statements and imposes an overall framework on a program. The design goal for COBOL was a self-documenting language that could be revised and maintained easily.
PL/I (Programming Language One) is a high-level programming language designed for scientific, engineering, and business applications. It is one of the most feature-rich programming languages ever created, and among the first of that kind. It has been used by academic, commercial, and industrial users since its introduction in the early 1960s, and is still actively used today. It supports recursion and structured programming. Its syntax is English-like and well suited to describing complex data formats, with a wide set of built-in functions for verifying and manipulating them.
Early 4GL
- MARK IV - data extraction from existing files, specified at the source level
- NATURAL - a language for extracting data from ADABAS files
- IBM RPG - a "Report generator" language that could also generate other types of applications