
History of digital computing

After the invention of the computer in the 1940s, several technologies and methodologies were developed; we attempt to outline a succinct history of them here. In each case, the first manifestations of each technology stand out; within a few years (and sometimes months), the branches multiply to the point where treating them would require more space than is available here, and more appropriate mechanisms of analysis and expression.

A reflection is in order here. It concerns a certain linear view of the history of computing, which treats it as a continuum running from Babbage's conception, through Turing's machine, to the developments of the 1940s, and from there to our day. Marches and countermarches and failed attempts, resembling the trials of biological evolution (with species disappearing, others developing complex balances, and so on), belie this simplistic view. Important agents usually left out complete the picture. A single firm, scarcely mentioned in the standard histories, Northrop Aircraft Inc., gave rise to no fewer than 14 companies and 23 types of computers after 1945. Richard Sprague argues that people, not corporations, make computers. The roots of Northrop's developments lay in aerospace projects, missile control from launch to target, better computer design and problem solving, and human talent. However, the company's financial problems between 1948 and 1950 soon led to its decline. Had Northrop been able, Sprague says, to finance all the projects that originated internally, it would have been the IBM of its time.

Let’s look at some of the events.

  • In 1884, J. A. Fleming of Great Britain invented the diode, building on an experiment by Edison. It is an important electronic device because it allows electric current to pass in only one direction.
  • 1906. Lee de Forest, of the United States, introduced a grid between the filament and the plate of a diode, obtaining the triode, which functions as an amplifier.
  • The printed circuit was invented in 1943 by P. Eisler, working in England. The technique replaces certain components of a circuit with a two-dimensional conducting path on an insulating support.
  • In December 1947, at the Bell Telephone Laboratories, the transistor was invented by the team led by William Bradford Shockley (1910-1989), together with Walter Houser Brattain (1902-1987) and John Bardeen (1908-1991); the invention was announced publicly in June 1948. The transistor is a three-electrode amplifier that uses a semiconductor such as germanium or silicon. All three received the Nobel Prize in physics in 1956, and Bardeen received it a second time in 1972. The theoretical bases of the transistor can be traced back to the work of Michael Faraday (1791-1867) in the 1830s. Faraday observed that the electrical conductivity of silver sulfide increased when heated, behaviour contrary to that of metallic conductors. In 1874, Karl Ferdinand Braun (1850-1918), professor of physics at Marburg and winner of the Nobel Prize in physics in 1909 (shared with Guglielmo Marconi, 1874-1937), discovered crystal rectifiers. The rectifiers were made with crystals of galena, a lead sulfide mineral (PbS), and were applied to the first radio sets. R. W. Pohl predicted, in 1933, that radio valves could be replaced by crystals in which the flow of electrons could be controlled.
  • In 1945, an article by Vannevar Bush (1890-1974), “As We May Think”, appeared in the Atlantic Monthly, in which he developed the idea of a store of knowledge, easily accessible and individually configurable. Douglas Engelbart and Ted Nelson drew on it for their conception of hypertext.
  • 1948. The Massachusetts Institute of Technology (MIT) teaches the first civilian operations research course.
  • In 1962, Ferranti's Atlas computer was released, introducing the concept of virtual memory: an address space larger than physical main memory. The Atlas, however, was not very successful in the market, and Ferranti abandoned the idea for its following products. See the appendix on “The Case of Virtual Memory”.
  • 1952. The Machines Research Corporation introduces the first OCR (Optical Character Recognition) machine.
  • 1953. The Compagnie des Machines Bull, the French punched-card company, produced its first computer, the Gamma 3.
  • In 1954, the IBM 704 was announced, with deliveries from 1955. This computer had very interesting features. Among them was the ferrite-core memory conceived by Gene Myron Amdahl (b. 1922), who would later form his own company, Amdahl, Inc. It also used FORTRAN as a programming language, and it was the first computer marketed with floating point (a representation involving a fixed-size mantissa and an exponent; for example, 314159265359E01 stands for 0.314159265359 x 10^1, i.e. 3.14159265359). Thanks to this machine the use of index registers spread (the IBM 704 had three), although they had first appeared in 1949 in the Manchester Automatic Digital Machine (MADM), where they were called the B Box or B Register. A General Motors engineer, Bob Patrick, wrote the first monitor system for this machine.
  • 1956. Bell Labs builds Leprechaun, the first fully transistorized machine.
  • 1956. Professor Jacques Perret proposes the word “ordinateur”, in response to a 1954 request from IBM France to give the French language a term equivalent to Data Processing Machine or calculateur. In his letter to the company, Perret tells them that it is a correctly formed word, found in the Littré as an adjective designating God putting order into the world. “A word of that kind has the advantage of easily giving a verb, ordiner, and an action noun, ordination.” But he warns them of a minor inconvenience: ordination designates a religious ceremony. However, the two fields of meaning, religion and accounting, are far apart. “Furthermore, your machine will be an ordinateur (and not an ordination), and that word is entirely outside theological use.”

After the success of the word, IBM decided to relinquish its rights to it. In Spanish, Spain adopted “ordenador”, while Latin America generally uses “computadora”; both words, however, seem to subsist as synonyms.
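The floating-point notation mentioned above for the IBM 704 can be illustrated with a toy decimal decomposition. This Python sketch is purely illustrative (the 704 actually used a binary mantissa and exponent, not decimal; the function name is ours):

```python
import math

def decompose(x):
    """Split x into a decimal mantissa m (0.1 <= |m| < 1) and an exponent e
    such that x == m * 10**e, mirroring the mantissa/exponent idea in the text."""
    if x == 0:
        return 0.0, 0
    e = math.floor(math.log10(abs(x))) + 1
    return x / 10**e, e

m, e = decompose(3.14159265359)
# m is approximately 0.314159265359 and e == 1, so 3.14159265359 is written
# as 0.314159265359 x 10^1 -- the 314159265359E01 of the text.
```

The point of the representation is that a number of any magnitude fits in a fixed-size word: only the mantissa digits and the small exponent need to be stored.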

  • 1958. Bank of America launches the first credit card.
  • 1958. Jack Kilby, of Texas Instruments, patents the first integrated circuit.
  • In 1961, the MITRE Corporation, under Project 473 of the United States Air Force, began work on an experimental transport-planning application, later known as the Experimental Transport Facility (ETF). This application was significant in that it demonstrated that, using externally described database features, two different applications could use the same set of generalized retrieval routines. Experience with this application and with Project 481 (Post Attack Command and Control System) indicated that the approach was valid. In August 1962, the ADAM (Advanced Data Management System) project was launched; it was completed in August 1966. It ran on an IBM 7030 (STRETCH) with 65K of memory. It had the ability to define, create, update, and interrogate files. The storage structure was sequential. The interrogation language, FABLE (First ADAM Basic Language), was close to English. Its format can be seen in the following metalinguistic expression:

FOR file name .out[input list]p[Boolean clause,]ut phrase/SAVE-list of items/ALL- [Boolean clause]. [NAME name]

It had interfaces for all the languages available for the MCP operating system. It could not be transferred (as it was said then; today we would say that it lacked “portability”) to other hardware or to another operating system. As early as 1963, General Electric implemented the first version of IDS (Integrated Data Store), using an IBM 702 computer. Since July 1968, IBM had the NMCS (National Military Command System) S/360 operational. The origins of that system can be traced back to the work on the 438L system for the 702 and 709 computers in 1959. Until 1969, all the database handlers examined by CODASYL (eleven) were hierarchical (tree-structured).

  • In 1957, the Digital Equipment Corporation (DEC) was founded by the brothers Ken and Stan Olsen together with Harlan Anderson. Ken had worked on the Whirlwind project at MIT. The first computer in the PDP series, the PDP-1, was ready in 1959.
  • Also in 1962, MOS (Metal Oxide Semiconductor) circuits appeared; S. Hofstein and F. Heiman, of the United States, were their inventors. The original design contained sixteen transistors, but the technology opened the door to large-scale integration (LSI).
  • In the summer of 1964, General Electric took control of the Compagnie des Machines Bull.
  • 1965. Robert Bemer creates ASCII (American Standard Code for Information Interchange), a 7-bit standard code in which all capital letters precede the lowercase letters.
  • 1965. Digital Equipment Corporation, DEC, releases the PDP-8 computer. It is claimed that it “established the concept of minicomputers”, giving rise to a “multibillion dollar industry”.
  • 1966. The American National Standards Institute (ANSI) releases the OCR-A font.
  • In 1957, Fairchild Semiconductor was created by, among others, Eugene Kleiner, who had worked with William Shockley (one of the transistor’s creators) and later co-founded the venture-capital firm Kleiner Perkins, and Gordon Moore, co-founder of Intel in 1968 and author of the law that bears his name. Moore was later joined at Intel by the Hungarian-born scientist Andrew Grove. Arthur Rock was the investor who raised capital for Fairchild Semiconductor, helped raise capital for Intel in 1968, and invested in Apple in 1978. Robert Noyce, another Fairchild founder, produced a monolithic version of the integrated circuit in 1959.
  • In September 1969, a Japanese company, Busicom, commissioned Intel to manufacture a chip for use in its line of desktop calculators. Thus the 4-bit 4004 microprocessor was born, in November 1971. The chip had been developed by Marcian Edward “Ted” Hoff (b. 1939), responsible for the architecture and instruction set; Federico Faggin (b. 1941), who joined in April 1970, carried out the circuit design and layout, and later founded Zilog, Inc.; and Stanley Mazor (b. 1941). The product contained more than 2,100 transistors. The 8008 (1972) and the 8080 (1974) followed; the first of these was developed for a company called Computer Terminal Corporation (CTC), later Datapoint.
  • In June 1970, IBM’s Edgar Codd published “A Relational Model of Data for Large Shared Data Banks” in the Communications of the ACM. In 1975, C. J. Date’s book An Introduction to Database Systems, published by Addison-Wesley, appeared. This marked the beginning of the database technology that would ultimately triumph. At the time, however, apart from hierarchical handlers there was a model called network, or plexus. Cincom Systems marketed a package called Total in the early 1970s (at least since 1972). IBM was competing at the time with IMS/DL 1, which the vendor considered hierarchical.
  • 1971. Steve Wozniak and Bill Fernandez build the “Cream Soda” computer.
  • 1971. Ray Tomlinson, of the United States, introduces the @ sign as the separator in e-mail addresses.
  • 1972. Texas Instruments produces the first pocket calculator. Gary Kildall, working as a consultant for Intel, had written CP/M.
  • 1973. The French company R2E manufactures the first microcomputer, the Micral. It was designed under André Truong Trong Thi but was a commercial failure.
  • 1975. MITS (Micro Instrumentation and Telemetry Systems), run by Ed Roberts, announces the Altair computer in Popular Electronics, a machine based on the Intel 8080 chip with 256 bytes of memory. The company lasted until May 22, 1977, when it was sold to Pertec.
  • Also in 1975, William Henry “Bill” Gates III (b. 1955) and Paul Allen (b. 1953), who had earlier formed Traf-O-Data, founded Microsoft and made their first software sale: BASIC for the Altair.
  • In the same year 1975 the first issue of Byte magazine, created by Wayne Green, appeared. Its first editor was Carl Helmers.
  • In mid-July 1975, Dick Heiser opened a small shop in the western area of Los Angeles. Its name was Arrowhead Computer Company; in smaller letters the sign read “The Computer Store”, and it was by the latter name that it became known.
  • 1976. Steve Wozniak and Steven Jobs founded Apple Computer (with the help of Mike Markkula). Jobs claims there were three milestones in personal computing: the Apple II in 1977, IBM’s PC in 1981, and the Macintosh, with its graphical interface, in 1984. “The Macintosh cost $2,500, or $1,000 more than IBM’s PC. 19% of Adobe had to be purchased to improve print output. There were few applications.” Jobs returned to Apple in 1997.
  • On September 21, 1976, Bill Millard founded ComputerLand and opened his pilot store in Hayward, California on November 10. The first franchise opened on February 18, 1977, in Morristown, New Jersey.
  • In the spring of 1977, Chuck Peddle introduced Commodore’s PET computer to the first West Coast Computer Faire, in San Francisco.
  • 1981. IBM introduces its personal computer, the PC.
  • On January 24, 1984, Apple Computer introduces the Macintosh.
  • 1986. Burroughs and Sperry merge, resulting in Unisys.

In 1987 the first mass virus infection occurs, on the Macintosh. The virus was called MacMag, or the Peace virus. John von Neumann had anticipated, in 1949, the possibility of programs that reproduce themselves, in “Theory and Organization of Complicated Automata”. This possibility rests on Alan Turing’s theory of automata. There is broad agreement, however, that the origin of the technology goes back to Core War, a game described by A. K. Dewdney of the Department of Computer Science at the University of Western Ontario, in a column for Scientific American in 1984. The game was inspired by the story of an earlier program, called Creeper, whose only mission was to duplicate itself every time it ran. The affected corporation developed another program, called Reaper, whose purpose was to destroy copies of Creeper until their extinction, and then self-destruct. This story, which may be apocryphal, appears to be based on a computer game called Darwin, developed by M. Douglas McIlroy of AT&T Bell Laboratories, and on Worm, a program written by John F. Shoch of Xerox Palo Alto. Darwin is described in Software Practice and Experience, Vol. 2, 1972, pp. 93-96. A vague description of what appears to be the same game is found in the 1978 edition of Computer Lib. In Darwin, each player supplies a number of assembler programs called organisms, which attempt to eliminate the organisms of the other species (i.e. those belonging to the other player) and occupy their space. McIlroy invented an immortal program, but it won few games (low aggression). Worm was created to optimize the use of a network of minicomputers; its goal was to avoid interfering with large processes when someone needed to use a busy machine. Subsequently, Dewdney and D. G. Jones provided, by mail, guidelines for readers who wrote in to implement a “Core War battlefield” of their own (the name alludes to the old ferrite-core memories). The language in which the warriors were programmed was called Redcode, similar to an assembler, and an executive program, Mars (Memory Array Redcode Simulator; Mars is also, of course, the name of the god of war), interpreted it. The rules of the game were as follows:

Rules of Core War

The rules of Core War are few and simple. The simpler the rules are, the simpler the referee program needs to be. Here are the rules we have used:

  1. Two battle programs are loaded into CORE at randomly chosen locations, subject to the constraint that the programs cannot overlap.
  2. The battle proceeds as Mars executes one instruction from program X, one instruction from program Y, one from X, one from Y and so on, until one of two events occurs:
     a. A previously specified number of instructions has been executed and both programs are still running. The battle is then declared a draw and ended.
     b. An instruction is encountered that cannot be interpreted by Mars and hence cannot be executed. The program with the faulty instruction is the loser.

Below is an excerpt presenting some concepts intended to place the game “on a more interesting level”:

Extensions to Redcode and Mars

The version of Mars presented here is easy to implement, and many of you may wonder how you can expand the Core War system. For example, in this version of Mars only two battle programs can be run at once. Would it be hard to let more programs execute? How about a new Redcode instruction that allows a running program to start up another program it has copied into a free area of CORE, and thereby increase the chance that at least one program from its “team” will survive the Core War battle?
The Redcode instruction set given here is a simple one. Those of you with access to a larger computer may want to experiment with new instruction sets and addressing modes, possibly making Redcode more like a real assembly language. Instructions that protect a larger program from a small, hard-to-defeat one would help to elevate Core War to a higher, more interesting level.  We would welcome documentation and listings of Core War systems and Redcode battle programs from anyone who thinks he has come up with a particularly interesting or innovative idea.
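The alternating execution described by the rules above can be sketched as a tiny referee loop. This is a hypothetical miniature, not real Redcode: the three-instruction set (MOV/JMP/DAT), the core size, and the tick limit are illustrative assumptions, and rule 1's overlap check is omitted.

```python
# Minimal sketch of the Mars referee loop described by the rules above.
CORE_SIZE = 8000
MAX_TICKS = 10_000

def battle(prog_x, prog_y, start_x, start_y):
    core = [("DAT", 0, 0)] * CORE_SIZE          # empty core: DAT is uninterpretable
    for i, ins in enumerate(prog_x):
        core[(start_x + i) % CORE_SIZE] = ins
    for i, ins in enumerate(prog_y):
        core[(start_y + i) % CORE_SIZE] = ins
    pcs = [start_x, start_y]                    # one program counter per player
    for tick in range(MAX_TICKS):
        player = tick % 2                       # X and Y alternate, one instruction each
        op, a, b = core[pcs[player]]
        if op == "MOV":                         # copy the cell at A to the cell at B (relative)
            core[(pcs[player] + b) % CORE_SIZE] = core[(pcs[player] + a) % CORE_SIZE]
        elif op == "JMP":                       # jump relative by A
            pcs[player] = (pcs[player] + a) % CORE_SIZE
            continue
        else:                                   # a faulty instruction: rule 2b, the player loses
            return "Y wins" if player == 0 else "X wins"
        pcs[player] = (pcs[player] + 1) % CORE_SIZE
    return "draw"                               # rule 2a: tick limit reached, both still running

# Imp, the classic one-line warrior: endlessly copies itself one cell ahead.
imp = [("MOV", 0, 1)]
# A program that immediately executes DAT and dies.
dud = [("DAT", 0, 0)]
print(battle(imp, dud, 0, 4000))  # prints "X wins"
```

Pitting imp against a copy of itself ends in a draw: each imp only ever writes more imp instructions ahead of the other, so neither encounters a faulty instruction before the tick limit.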

  • In the same year, 1984, Fred Cohen presented the concept of the computer virus at a conference. In 1988, Robert Morris contaminated the Internet and Arpanet with a worm, causing an estimated $100 million in losses. In 1991 the famous Michelangelo virus appeared.
  • 1990. Microsoft releases Windows. Jobs comments on the product: “If Apple hadn’t created it, the graphical interface wouldn’t exist.”
  • 1991. Linus Torvalds creates Linux in Helsinki. He promotes the idea that all software should be free.
  • 1997. Intel releases the Pentium II: 7.5 million transistors on one chip.


The first printers used in computers borrowed technologies from conventional punched-card equipment. Printing was produced by the impact of hammers striking, through the paper and an inked ribbon, against a type (digit, letter or special character) mounted on a bar or wheel. In 1960, IBM released the IBM 1403, associated with the IBM 1401 computer, which featured a chain of five character sets passing in front of a set of 132 hammers.

Magnetic tape drives first appeared as half-inch, reel-to-reel audio recording devices in 1934, manufactured by the German company AEG using BASF tapes. By 1945, IBM had developed its own version. In 1951 the Univac Uniservo was introduced, which used 1,200-foot metal tape.

In 1956 IBM introduced the RAMAC 350, which stored 4.4 megabytes. In 1962 came the first removable disk pack, the IBM 1311 for the IBM 1440: 14 inches, 6 disks, 10 surfaces, and 3 million characters. In 1971, IBM introduced the flexible disk (floppy disk), 8 inches in diameter with 400 Kbytes of capacity. In 1976, Shugart Associates produced the 5 1/4-inch floppy with 110 Kbytes of capacity; the company went on to ship 4,000 units a day. Production was later moved to Matsushita in Japan, which became the world’s largest manufacturer of flexible disks. In 1981, Sony introduced the 3 1/2-inch diskette with 438 Kbytes of storage; the capacity was later revised to 1 Mbyte (720 Kbytes formatted).


The idea of optical storage dates back to 1927, when John Logie Baird demonstrated a “Phonovision” system based on a waxed disc read by an optical scanner. In 1935, Baird Radiovision discs offering 6 minutes of video were sold by London department stores. In 1961, 3M began working on optical recording, and by 1965 it had obtained several patents. In 1974, Philips demonstrated a laser recorder and reader, and in 1978 it demonstrated the audio compact disc (CD) system, which made phonographic records obsolete. The first optical devices for computers were shipped in 1983; these were 12-inch WORM (Write Once, Read Many times) drives. Audio CDs were the basis of the CD-ROMs that began shipping in 1985.


Dr. Grace Brewster Murray Hopper (1906-1992), of the Eckert-Mauchly Computer Corporation, led the first commercial effort in automatic programming. The Univac I (released in 1951) was programmed in a very rudimentary mnemonic code.

FLOW-MATIC, a language that would be of great importance in the development of COBOL, was released in 1956. As IBM 704 project leader, John Backus (b. 1924) assembled a team to develop a language for scientists, mathematicians and engineers. The specification was ready in 1954. Thus came FORTRAN (FORmula TRANslation), in April 1957, after some 25 man-years of work. Subsequently, a version VI of FORTRAN was renamed PL/I, in 1965.

In the spring of 1958 The Preliminary Report on an International Algebraic Language appeared, later called ALGOL.

In May 1959, at the initiative of the US Department of Defense, CODASYL, an acronym for Committee on Data Systems Languages, was created. The specifications for the Backus-Naur Form, or BNF, also appeared that year.

The first report on COBOL (COmmon Business Oriented Language) was produced in April 1960; an improved version appeared in 1961 (COBOL 61).

In the early 1960s, IBM released RPG (Report Program Generator) to ease the conversion from conventional punched-card equipment to the IBM 1401. This reporting system was repeated for the IBM 360 model 20, the IBM System/3 (RPG II), the IBM System/38 (RPG III), and so on. This is further proof of the persistence of working modes that, despite technological mutations and their possibilities, came from the old electromechanical systems.

In May 1964, John Kemeny and Thomas Kurtz of Dartmouth College released BASIC (Beginner’s All-purpose Symbolic Instruction Code). The project had initially aimed at making it easier to teach computing to students outside the sciences, but its simplicity made it one of the most popular languages, first for time sharing and then for minicomputers. Bill Gates claims that BASIC did not become the best-known and most accessible computer language merely because it came free with each machine. Among the attributes the founder of Microsoft found in it were the simplicity of using an interpreter, its powerful string handling, the richness of its keywords and English-like syntax, and the freedom it gives programmers. Obviously, by current standards, some of these supposed advantages are debatable. One fact is certain, however: companies such as Basic Four relied heavily on this language to forge their success. Today, Microsoft maintains a version of this language: Visual Basic.

Pascal, a structured language developed by Niklaus Wirth (b. 1934) of the Institut für Informatik in Zurich, Switzerland, was released in 1968. Pascal was the inspiration for several “Pascal-like” languages, such as C, C++, Ada, etc.


The first microcomputers came on the scene in 1975, from companies such as MITS and Southwest Technical Products. Their small memories only allowed them to work in machine language. Bill Gates and Paul Allen developed a version of BASIC for MITS’ 8-bit Altair machine. It was important, Gates recalls, to use an “OK” prompt instead of “Ready” to save memory.

In 1979, Bob Frankston (b. 1949) and Dan Bricklin (b. 1951), two MIT graduates, created VisiCalc, the first spreadsheet (though not originally called that), for the Apple II. VisiCalc was programmed in 6502 assembler. It had 254 rows by 63 columns. It followed two recalculation patterns, sweeping the columns from left to right and the rows from top to bottom, frequently producing unresolvable circular references. Text did not extend beyond its column. Frankston and Bricklin founded Software Arts on January 2, 1979 (later absorbed by Lotus). The software was quickly ported to virtually all personal computers based on the 6502 and Z80 microchips. However, VisiCalc did not run on the CP/M operating system until after Sorcim had taken that niche with SuperCalc. Microsoft responded with Multiplan. In 1983, Lotus swept the market with 1-2-3.
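The sweep-order recalculation just described can be modeled with a toy example (the cell layout and evaluation scheme here are hypothetical illustrations, not VisiCalc's actual code). It shows why a forward reference sees a stale value on the first pass:

```python
# Cells are visited column by column (left to right), rows top to bottom
# within each column, and each formula is evaluated once per pass. A cell
# that depends on one visited later sees a stale value.
cells = {
    ("A", 1): 10,
    ("A", 2): lambda g: g[("A", 1)] * 2,   # depends on a cell already visited: fine
    ("B", 1): lambda g: g[("A", 2)] + 1,   # column B is visited after A: fine
    ("A", 3): lambda g: g[("B", 1)] + 1,   # forward reference: sees B1's OLD value
}

def recalc(cells, values):
    for col in sorted({c for c, _ in cells}):                # columns left to right
        for row in sorted(r for c, r in cells if c == col):  # rows top to bottom
            v = cells[(col, row)]
            values[(col, row)] = v(values) if callable(v) else v
    return values

values = {k: 0 for k in cells}   # start with stale zeros
recalc(cells, values)
print(values[("A", 3)])  # prints 1: B1 was still 0 when A3 was computed
recalc(cells, values)
print(values[("A", 3)])  # prints 22 once B1 has been computed on the earlier pass
```

Repeated passes eventually settle acyclic sheets, but a genuine circular reference never converges, which is the trouble the text mentions.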


The Case of Virtual Memory (based on a corporate study by the author, September 1974).

What is virtual memory?

Virtual storage is a technique that allows the operating system to function as if it had virtually unlimited memory. The programs to be executed are loaded into virtual memory, which consists of a space in real memory (the classic central memory) plus an additional space on a direct-access device (magnetic disk or drum).

When programs run, the active parts of virtual memory are allocated to real memory in small blocks, or pages. To make them fit, real space is divided into page frames. The pages of a given program can occupy any available frame in main memory, regardless of where they sit in virtual storage. When the real space occupied by a page is needed, the page is written back to the disk or drum. When the program requires it again, a page demand occurs and the page is brought back from the direct-access device into memory, into any available frame.
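The mechanism just described (pages, frames, page demand, write-back to the drum) can be sketched as a small simulation. The FIFO eviction order and the tiny sizes are illustrative assumptions, not any particular system's policy:

```python
# Minimal sketch of demand paging: a small real memory of page frames backs a
# larger virtual address space, and touching a non-resident page triggers a
# "page demand" that may evict another page to the backing device.
from collections import OrderedDict

PAGE_SIZE = 4      # words per page (tiny, for illustration)
NUM_FRAMES = 2     # real memory holds only two page frames

class Memory:
    def __init__(self):
        self.frames = OrderedDict()   # page -> frame contents, in load order (FIFO)
        self.backing = {}             # the direct-access device (disk or drum)
        self.faults = 0

    def touch(self, address):
        page = address // PAGE_SIZE
        if page not in self.frames:               # page demand (page fault)
            self.faults += 1
            if len(self.frames) >= NUM_FRAMES:    # real memory full: evict oldest page
                victim, data = self.frames.popitem(last=False)
                self.backing[victim] = data       # write the victim back to the drum
            self.frames[page] = self.backing.get(page, [0] * PAGE_SIZE)
        return self.frames[page][address % PAGE_SIZE]

mem = Memory()
for addr in [0, 1, 4, 8, 0]:   # touches pages 0, 0, 1, 2, 0
    mem.touch(addr)
print(mem.faults)  # prints 4: pages 0, 1, 2 each fault once, then 0 faults again
```

The final access to page 0 faults because page 0 was evicted to make room for page 2, which is exactly the traffic the replacement policies discussed below try to minimize.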

How was the concept of virtual memory developed?

  • 1962. The English Ferranti Atlas computer ran multiple job streams, using a fixed-size page structure. The system proposed a page-replacement policy that attempted to detect cyclic behaviour in page references, minimizing page traffic by removing from main memory those pages it did not expect to use for a long time. This method was successful for cyclic programs, but its performance was poor for programs with random page demand. Its implementation was also very expensive.
  • 1958. The Gamma 60 of the Compagnie Bull of France distributed its work among a number of peripheral processors. Apparently its operation was not good, as the French railway company replaced this computer with a Univac 1107 in 1965.
  • 1960. The EMIDEC 2400 ran up to 24 independent parallel processes, remarkable for that date. It was an all-asynchronous machine designed by N. Brown and his team (now part of the ICL group in England).
  • 1962. Burroughs successfully installs its 5000 series in the United States, based on a fixed-head disk and the swapping of variable-length segments. As is well known, no job can be fitted a priori within a fixed length, nor even within lengths that are multiples of a given fixed length. Citing E. J. Gaudion, we can say that “the size of the information is a separate variable that changes dynamically within instructions and groups of data, between statements and groups of data and also from application to application. From this we derive the law: any fixed page size is automatically the wrong value”. Heretical as it may sound, Burroughs solved this problem by allowing work segments to be whatever length they needed to be (up to a maximum of 1024 words), something uncommon even in our day. According to the author quoted, for the first time in history the “suffering users” got something important from the manufacturers: Burroughs provided an easy-to-use system. Returning to the quote: “Unfortunately, no one knew how good it was until the historic System/360 announcement and until the trauma of its job-control language became self-evident.”
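Variable-length segment allocation of the kind Burroughs championed can be sketched with a toy first-fit allocator. The first-fit strategy and the memory size are illustrative assumptions; only the 1024-word segment cap comes from the text above:

```python
# Toy first-fit allocator for variable-length segments: each segment gets
# exactly the space it needs, at the cost of leaving variable-sized holes.
MEMORY_WORDS = 4096
MAX_SEGMENT = 1024   # the B 5000's cap on segment length, per the text

free_list = [(0, MEMORY_WORDS)]   # (start, length) holes, kept sorted by start

def allocate(length):
    """Place a segment of exactly `length` words in the first hole that fits."""
    assert 0 < length <= MAX_SEGMENT
    for i, (start, hole) in enumerate(free_list):
        if hole >= length:
            if hole == length:
                free_list.pop(i)                          # hole consumed exactly
            else:
                free_list[i] = (start + length, hole - length)  # shrink the hole
            return start
    return None   # no hole large enough: some segment would have to be swapped out

a = allocate(300)
b = allocate(1000)
c = allocate(50)
print(a, b, c)  # prints "0 300 1300": each segment starts right after the previous one
```

The contrast with fixed-size pages is visible in the free list: holes here are exactly segment-shaped, so no space is wasted inside an allocation, but external fragmentation of the hole list becomes the new problem.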

Those who took part in the Burroughs concept were, among others, King, Glaser, Lonergan and Barton. The Burroughs 5000 also proposed novel ideas: the pointer, which avoids the transfer of large masses of information, and the (simple) stack. As a result it also gave birth to the first viable, “no penalty” multiprogramming system. Later, to accommodate multiprocessing and file sharing, came the concept of the “cactus stack”, which ran successfully in the 6500 series.

Programs in the B 5000 were segmented by the compilers at the level of ALGOL blocks or COBOL paragraphs. The segment is used directly as the storage unit. Each segment was fetched when information belonging to it was referenced. Each program in the system is associated with a Program Reference Table (PRT). This table is the initial segment of each program, and a special register is set to point to the table’s start address. Each segment of the program is represented by an entry in this table, which provides the base address of the segment and its extent, together with an indication of whether or not the segment is, at that instant, in working storage.

Multiple replacement strategies were tried during the system’s development, but an essentially cyclical strategy was accepted as the most effective.

The B 8500 is similar to the 5000 series. Its most notable novelty is its 44-word thin-film memory. Sixteen words are used for fetching data and instructions, twenty-four for temporary storage of the PRT and the registers in use, and the remaining four make up a queue.

  • 1962. Rice University implemented a segmented space for its computer, analogous to that of the B 5000. Its storage unit was the segment. Segments were initially placed sequentially in memory, in a block of contiguous addresses the first of which referenced the segment’s code words, words used to provide a compact characterization of individual program and data segments (very similar to descriptors and to the PRT elements of the B 5000). When a segment loses its significance, it is designated inactive and its first word is filled with the block size and the address of the next inactive block. When space is required, the chain of inactive blocks is scanned sequentially to find one of sufficient size. If this fails, the search falls to a replacement algorithm that takes into account the existence of a copy of the segment in backup memory, and whether or not the segment has been used since the last time it was considered for replacement.
  • 1963. Univac, under the influence of Lonergan, recognized the importance of exchanging programs between main memory and an external medium, and its operating systems thereafter implemented fast access to the magnetic drum. Unfortunately, although they divided programs into pieces that were quickly shuttled between drum and memory, they lacked a happy phrase such as “virtual memory”. In addition, the system required user intervention in the initial division of the program, and operating-system “turns” in the storage process.
  • 1965. The Control Data Corporation automates the process of storing pages, using fixed-size pages and simplifying the initial split by partitioning programs into modules of the same size as those of real memory (the Digital Equipment Corporation had a similar approach).

The CDC 3300, which we will discuss in some detail, did not provide dynamic relocation, among other reasons because it had a fixed table size for each program, provided no table space for page statistics, and offered the programmer an address space smaller than the machine’s main memory.

The system provided by CDC was implemented entirely in hardware. Up to eight programs could run simultaneously, but only one had control of the computer at any given moment. A multiprogramming monitor controlled the progression from one program to another; to this end, it set the program register to indicate which program had control. The heart of the hardware system was a page file providing sixteen registers for each of the eight programs. Each register provided a translation entry relating the segment number of a program to the block number for that segment within memory. Segment numbers ranged from 1 to 16, and there were 128 addressable pages, each page of 2,000 words. When a segment was looked up through its translation register, it could happen that the corresponding entry indicated that the required block was absent. This caused an interrupt to the multiprogramming monitor, which could then bring the information into main memory.

  • 1967. GE/Honeywell put into operation the MULTICS system, one of whose fathers was the aforementioned Glaser. MULTICS (MULTiplexed Information and Computing Service) was a joint project of the Massachusetts Institute of Technology (MIT), the Bell Telephone Laboratories, and the General Electric Company. The initial system was implemented on a GE 645 configured with two processors, 128K of core storage, four million words of drum storage, and sixteen million on disk. The system was designed to serve a large number of users. Unlike the B 5000, the segment in this system was not the storage unit; instead, a variant of the paging scheme is used: the system has two page sizes, of 64 and 1024 words. This technique, at the cost of greater complexity in the replacement strategies, reduces the storage loss caused by fragmentation within pages. It uses two levels of indirection, through a segment table and a group of page tables; each entry in the segment table indicates the location of the page table for that segment. A small associative memory is used to hold the addresses of recently accessed pages and thereby reduce the overhead of the address-translation process. The programmer could indicate data or program modules he wished to keep in memory, information that by frequency of use should, if possible, remain in memory, and information that would not be accessed again by the program, thereby assisting the page-demand process.
  • 1967. IBM adapted its regular 360 line by adding a swapping mechanism, and the IBM 360/67 was born. This was also a fixed-size page system, but it was not called virtual memory. By that time, however, IBM had developed an experimental computer, designed and installed at the Thomas J. Watson Research Center. The basic hardware of this system, called the M44, was a 7044 computer extensively modified, mainly by the addition of approximately 200,000 words of core memory. The access time was 8 microseconds. Each user who communicated with the system through a terminal was “given the impression” of using a separate computer (called a 44X) with two million words of memory space, a complete instruction repertoire, and a skeleton operating system. These “virtual machines” were, in effect, controlled by a modular operating system (MOS) that ran on the M44. The system also had an IBM 1301 disk file of 9 million words, used as backing storage. The page size could be varied at system startup, which was done for experimentation purposes. Several replacement algorithms were also tried; one of them, in particular, selected at random from several acceptable candidates, judged by their frequency of use and by whether or not the page had been modified. The machine also had instructions to indicate whether a page had been used within a short time, or had not been used within a certain time, although little use was made of this facility, so its effectiveness could not be established.

Returning to the IBM 360/67, let us add that a typical configuration, according to the available descriptions, would be as follows: two processors of 256 KBytes each, a 4 MB drum, and about 500 MB on disk. Two versions of this computer were foreseen: one with 24-bit addresses and one with 32-bit addresses. The available memory space was one million bytes, with a maximum number of segments equal to 16 for 24-bit addressing and 4096 for 32-bit addressing. In the first case, because of the limited number of segments, it could be necessary to place multiple programs within the same segment. It had an associative memory of nine registers, eight of which were used to store table entries for recently accessed pages; the ninth was used to speed up the translation of the instruction counter into a real physical address. It automatically kept a record of modifications to the information contained in the page frames.
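The two-level translation with a small associative memory, common to the MULTICS and 360/67 descriptions above, can be sketched as follows. All names, sizes, and the replacement rule for the associative memory are illustrative assumptions, not details of either machine.

```python
# Illustrative two-level address translation: a segment table points to
# per-segment page tables, and a small associative memory caches the
# most recently translated (segment, page) pairs. Sizes are arbitrary.

PAGE_SIZE = 1024  # words per page (illustrative)

segment_table = {}       # segment number -> page table (dict: page -> frame)
associative_memory = {}  # (segment, page) -> frame, kept small
ASSOC_CAPACITY = 8

def translate(segment: int, address: int) -> int:
    """Translate a (segment, address-within-segment) pair to a real address."""
    page, offset = divmod(address, PAGE_SIZE)
    frame = associative_memory.get((segment, page))
    if frame is None:
        # Full walk: segment table entry locates that segment's page table.
        frame = segment_table[segment][page]
        if len(associative_memory) >= ASSOC_CAPACITY:
            associative_memory.pop(next(iter(associative_memory)))
        associative_memory[(segment, page)] = frame
    return frame * PAGE_SIZE + offset

segment_table[2] = {0: 7, 1: 3}   # segment 2 occupies frames 7 and 3
assert translate(2, 5) == 7 * PAGE_SIZE + 5
assert translate(2, PAGE_SIZE + 1) == 3 * PAGE_SIZE + 1
```

The associative memory is what makes the double indirection affordable: repeated references to the same page skip both table lookups.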

  • 1970. Radio Corporation of America was the first company to bring the term “virtual memory” to market. The Spectra 70, almost a duplicate of the IBM 360, used a drum and fixed-size pages. The communication to the ACM by G. Oppenheimer and N. Weizer of RCA announces: “The Time Sharing Operating System (TSOS), of the Spectra 70, is designed to control a batch timeshare and multi-programming system that supports up to 48 conversational users and a combined group of 64 batch and interactive tasks, processing simultaneously.” According to these authors, its pages were 4 KB, the drum had a capacity of 800 to 1600 pages with a transfer rate of 333 KB, physical memory was 131 or 262 KB, and there were two million bytes of virtual storage. Of course, the Spectra 70/46 virtual model had built-in memory address translation hardware.

This sketch concludes with IBM’s and Burroughs’ announcements of virtual memory availability on their 370 and B 1700 systems respectively. From Burroughs, we transcribe this laconic statement that appeared in the B 1700 System Reference Manual: “Virtually infinite memory…”.


This historical journey leads us to understand that the concept of virtual memory had been on the market for some 18 years before being announced as such. We reread our notes on the Ferranti Atlas: “Physical storage was a 98,304-word drum. The pages were 512 words. The replacement strategy was based on a learning program. This program makes use of information recording, for each page, the time since it was last accessed and the duration of its previous period of inactivity.” All the fundamental elements of virtual memory are there.
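The learning strategy quoted above can be sketched roughly as follows. The eviction rule is a simplified reading of the description (a page whose current idleness most exceeds its usual period of inactivity is presumed least likely to be needed soon); it is not Ferranti's actual algorithm, and all names are assumptions.

```python
# Rough sketch of a "learning" replacement strategy in the spirit of the
# Atlas description: for each resident page we keep the time since its
# last access and the length of its previous period of inactivity.
# Simplified reading of the text, not Ferranti's code.

class Page:
    def __init__(self):
        self.since_last_use = 0    # time since last access
        self.prev_inactivity = 0   # previous inactivity duration

resident = {}  # page number -> Page

def tick():
    """One unit of time passes for every resident page."""
    for p in resident.values():
        p.since_last_use += 1

def access(page_no: int):
    p = resident.setdefault(page_no, Page())
    p.prev_inactivity = p.since_last_use  # remember how long it sat idle
    p.since_last_use = 0

def choose_victim() -> int:
    # Evict the page whose current idleness most exceeds its usual one.
    return max(resident, key=lambda n: resident[n].since_last_use
                                       - resident[n].prev_inactivity)

access(1)
access(2)
for _ in range(3):
    tick()
access(2)                      # page 2 keeps being used; page 1 does not
assert choose_victim() == 1    # page 1 has outlived its learned rhythm
```

A page accessed on a regular cycle thus survives idle spells of its usual length, which is what the “learning” label in the quotation points at.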


· Byte staff, “Micro, Micro: Who Made the Micro?”, Byte, January 1991, 305-312.
· CODASYL, A Survey of Generalized Data Base Management Systems, May 1969.
· Dewdney, A.K., “In the game called Core War hostile programs engage in a battle of bits”, Scientific American, May 1984, 15-19.
· Evans Christopher, The Making of the Micro, New York, Van Nostrand Reinhold, 1981.
· Freiberger Paul, Swaine Michael, Fire in the Valley, Berkeley, Ca., Osborne/McGraw-Hill, 1984.
· Gaskin Robert R., “Paper, Magnets, and Light”, Byte, November 1989, 391-399.
· Gates Bill, “The 25th Birthday of Basic”, Byte, October 1989, 268-276.
· Keep Christopher, McLaughlin Tim, Parmar Robin, 1993-2000. Accessed: 19/12/2002.
· Kidder Tracy, The Soul of a New Machine, Boston, Atlantic-Little, Brown, 1981.
· Licklider Tracy Robnett, “Ten Years of Rows and Columns”, Byte, December 1989, 324-331.
· Moore Business Forms, The World of Optical Character Recognition, Moore Business Forms Inc., M4045, Niagara Falls, New York, 1971.
· Moreau R., Ainsi Naquit l’Informatique, Paris, Dunod, 2nd ed., 1982.
· Neibauer Alan, “A History of Programming Languages”, 80 Micro, July 1983, 228-236.
· Reid-Green Keith, “History of Computers. The IBM 704”, Byte, January 1979, 190-192.
· Sweet Frank, “What, if Anything, Is a Relational Database?”, Datamation, July 15, 1984, 118-124.
· Von Neumann John, “The General and Logical Theory of Automata”, 1951, in Zenon W. Pylyshyn (ed.), Perspectives on the Computer Revolution, Madrid, Alianza, 1975.