History of computing
The history of computing is longer than the history of computing hardware and modern computing technology and includes the history of methods intended for pen and paper or for chalk and slate, with or without the aid of tables.
Concrete devices
Digital computing is intimately tied to the representation of numbers.[1] But long before abstractions like the number arose, there were mathematical concepts to serve the purposes of civilization. These concepts are implicit in concrete practices such as:
- One-to-one correspondence,[2] a rule to count how many items, e.g. on a tally stick, eventually abstracted into numbers;
- Comparison to a standard,[3] a method for assuring reproducibility in a measurement, for example, the number of coins;
- The 3-4-5 right triangle was a device for assuring a right angle, for example by using ropes with 12 evenly spaced knots.[4]
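The knotted rope works because the side lengths satisfy the Pythagorean relation, which forces the angle opposite the longest side to be a right angle:

$$3^2 + 4^2 = 9 + 16 = 25 = 5^2.$$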
Numbers
Eventually, the concept of numbers became concrete and familiar enough for counting to arise, at times with sing-song mnemonics to teach sequences to others. All known human languages, except the Pirahã language, have words for at least "one" and "two", and even some animals like the blackbird can distinguish a surprising number of items.[5]
Advances in the numeral system and mathematical notation eventually led to the discovery of mathematical operations such as addition, subtraction, multiplication, division, squaring, square root, and so forth. Eventually the operations were formalized, and concepts about the operations became understood well enough to be stated formally, and even proven. See, for example, Euclid's algorithm for finding the greatest common divisor of two numbers.
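As a modern illustration (the code and function name are ours, not Euclid's), the algorithm fits in a few lines of Python:

```python
def gcd(a, b):
    """Euclid's algorithm: repeatedly replace the pair (a, b)
    with (b, a mod b) until the remainder is zero."""
    while b != 0:
        a, b = b, a % b
    return a

print(gcd(252, 105))  # 21, since 252 = 2^2 * 3^2 * 7 and 105 = 3 * 5 * 7
```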
By the High Middle Ages, the positional Hindu–Arabic numeral system had reached Europe, which allowed for the systematic computation of numbers. During this period, the representation of a calculation on paper allowed the calculation of mathematical expressions, and the tabulation of mathematical functions such as the square root and the common logarithm (for use in multiplication and division) and the trigonometric functions. By the time of Isaac Newton's research, paper or vellum was an important computing resource, and even in the twentieth century researchers like Enrico Fermi would cover random scraps of paper with calculations, to satisfy their curiosity about an equation.[6] Even into the period of programmable calculators, Richard Feynman would unhesitatingly compute by hand any steps which overflowed the memory of the calculators, just to learn the answer.
Early computation
The earliest known tool for use in computation is the Sumerian abacus, thought to have been invented in Babylon c. 2700–2300 BC. It was originally used by drawing lines in sand with pebbles. Abaci of a more modern design are still used as calculation tools today. This was the first known computer, and the most advanced system of calculation of its time, preceding Greek methods by 2,000 years.
In c. 1050–771 BC, the south-pointing chariot was invented in ancient China. It was the first known geared mechanism to use a differential gear, which was later used in analog computers. The Chinese also invented a more sophisticated abacus, in use from around the 2nd century BC, known as the Chinese abacus.
In the 5th century BC in ancient India, the grammarian Pāṇini formulated the grammar of Sanskrit in the Ashtadhyayi, a highly systematized and technical work of 3,959 rules. Pāṇini used metarules, transformations and recursions.[7]
In the 3rd century BC, Archimedes used the mechanical principle of balance (see the Archimedes Palimpsest) to calculate mathematical problems, such as the number of grains of sand in the universe (The Sand Reckoner), which also required a recursive notation for numbers (e.g., the myriad myriad).
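In this notation a myriad is 10,000, so the basic unit of Archimedes' recursion is

$$\text{myriad myriad} = 10^4 \times 10^4 = 10^8,$$

and stacking such units let him express his estimate, on the order of $10^{63}$ grains of sand.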
Around 200 BC the development of gears had made it possible to create devices in which the positions of wheels would correspond to positions of astronomical objects. By about 100 AD Hero of Alexandria had described an odometer-like device that could be driven automatically and could effectively count in digital form.[8] But it was not until the 1600s that mechanical devices for digital computation appear to have actually been built.
The Antikythera mechanism is believed to be the earliest known mechanical analog computer.[9] It was designed to calculate astronomical positions. It was discovered in 1901 in the Antikythera wreck off the Greek island of Antikythera, between Kythera and Crete, and has been dated to circa 100 BC.
The Russian abacus, the schoty (Russian: счёты, plural of счёт, counting), is a much later design. It usually has a single slanted deck, with ten beads on each wire (except one wire, usually positioned near the user, which has four beads for quarter-ruble fractions). Older models have another four-bead wire for quarter-kopeks, which were minted until 1916. The Russian abacus is often used vertically, with each wire running left to right like the lines in a book. The wires are usually bowed upward in the center, to keep the beads pinned to either of the two sides. It is cleared when all the beads are moved to the right; during manipulation, beads are moved to the left.
Mechanical analog computer devices appeared again a thousand years later in the medieval Islamic world and were developed by Muslim astronomers, such as the mechanical geared astrolabe by Abū Rayhān al-Bīrūnī,[10] and the torquetum by Jabir ibn Aflah.[11] According to Simon Singh, Muslim mathematicians also made important advances in cryptography, such as the development of cryptanalysis and frequency analysis by Alkindus.[12][13] Programmable machines were also invented by Muslim engineers, such as the automatic flute player by the Banū Mūsā brothers,[14] and Al-Jazari's humanoid robots and castle clock, which is considered to be the first programmable analog computer.[15]
During the Middle Ages, several European philosophers made attempts to produce analog computer devices. Influenced by the Arabs and Scholasticism, Majorcan philosopher Ramon Llull (1232–1315) devoted a great part of his life to defining and designing several logical machines that, by combining simple and undeniable philosophical truths, could produce all possible knowledge. These machines were never actually built, as they were more of a thought experiment to produce new knowledge in systematic ways; although they could perform simple logical operations, they still needed a human being for the interpretation of results. Moreover, they lacked a versatile architecture, each machine serving only a very concrete purpose. In spite of this, Llull's work had a strong influence on Gottfried Leibniz (early 18th century), who developed his ideas further and built several calculating tools based on them.
Indeed, when John Napier discovered logarithms for computational purposes in the early 17th century, there followed a period of considerable progress by inventors and scientists in making calculating tools. The apex of this early era of formal computing can be seen in the difference engine and its successor, the analytical engine (which was never completely constructed but was designed in detail), both by Charles Babbage. The analytical engine combined concepts from his work and that of others to create a device that, if constructed as designed, would have possessed many properties of a modern electronic computer, including an internal "scratch memory" equivalent to RAM, multiple forms of output (a bell, a graph-plotter, and a simple printer), and a programmable input-output "hard" memory of punch cards which it could modify as well as read.

The key advance of Babbage's devices over their predecessors was that each component was independent of the rest of the machine, much like the components of a modern electronic computer. This was a fundamental shift in thought; previous computational devices served only a single purpose and had, at best, to be disassembled and reconfigured to solve a new problem. Babbage's devices could be reprogrammed to solve new problems by the entry of new data, and could act upon previous calculations within the same series of instructions.

Ada Lovelace took this concept one step further, by creating a program for the analytical engine to calculate Bernoulli numbers, a complex calculation requiring a recursive algorithm. This is considered to be the first example of a true computer program: a series of instructions that act upon data not known in full until the program is run. Following Babbage, although unaware of his earlier work, Percy Ludgate in 1909 published the second of the only two designs for mechanical analytical engines in history.[16]
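Lovelace's published diagram used the engine's own tabular notation; the following is only a modern sketch of the same recursive idea (the function name and layout are ours, not hers):

```python
from fractions import Fraction
from math import comb

def bernoulli(n):
    """Bernoulli numbers B_0..B_n via the recurrence
    sum_{j=0}^{m} C(m+1, j) * B_j = 0 for m >= 1, with B_0 = 1."""
    B = [Fraction(1)]
    for m in range(1, n + 1):
        B.append(-sum(comb(m + 1, j) * B[j] for j in range(m)) / (m + 1))
    return B[n]

print([str(bernoulli(k)) for k in range(7)])
# ['1', '-1/2', '1/6', '0', '-1/30', '0', '1/42']
```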
Several examples of analog computation survived into recent times. A planimeter is a device which computes integrals, using distance as the analog quantity. Until the 1980s, HVAC systems used air both as the analog quantity and as the controlling element. Unlike modern digital computers, analog computers are not very flexible and need to be reconfigured (i.e., reprogrammed) manually to switch from working on one problem to another. Analog computers had an advantage over early digital computers in that they could solve complex problems using behavioral analogues, while the earliest attempts at digital computers were quite limited.
Since computers were rare in this era, the solutions were often hard-coded into paper forms such as nomograms,[17] which could then produce analog solutions to these problems, such as the distribution of pressures and temperatures in a heating system.
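As a point of comparison with such analog methods, the quantity a planimeter measures mechanically, the area enclosed by a traced curve, can be approximated digitally from sampled boundary points with the shoelace formula; the polygon below is our own illustrative input:

```python
def shoelace_area(points):
    """Area of a polygon given as (x, y) vertices in traversal order,
    via the shoelace formula (a discrete form of Green's theorem)."""
    n = len(points)
    twice_area = 0.0
    for i in range(n):
        x0, y0 = points[i]
        x1, y1 = points[(i + 1) % n]
        twice_area += x0 * y1 - x1 * y0
    return abs(twice_area) / 2.0

print(shoelace_area([(0, 0), (1, 0), (1, 1), (0, 1)]))  # 1.0, a unit square
```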
Digital electronic computers
The “brain” [computer] may one day come down to our level [of the common people] and help with our income-tax and book-keeping calculations. But this is speculation and there is no sign of it so far.
So wrote the British newspaper The Star in a June 1949 news article about the EDSAC computer, long before the era of personal computers.[18]
None of the early computational devices were really computers in the modern sense, and it took considerable advancement in mathematics and theory before the first modern computers could be designed.
The first recorded idea of using digital electronics for computing was the 1931 paper "The Use of Thyratrons for High Speed Automatic Counting of Physical Phenomena" by C. E. Wynn-Williams.[19] From 1934 to 1936, NEC engineer Akira Nakashima published a series of papers introducing switching circuit theory, using digital electronics for Boolean algebraic operations,[20][21][22] influencing Claude Shannon's seminal 1938 paper "A Symbolic Analysis of Relay and Switching Circuits".[23]
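Switching circuit theory showed that networks of on/off elements, relays then, transistors now, can evaluate any Boolean expression. As a hedged modern sketch (the function is ours, not from the cited papers), a one-bit half adder reduces addition to one XOR and one AND:

```python
def half_adder(a, b):
    """One-bit half adder: the sum bit is XOR, the carry bit is AND,
    the kind of Boolean network studied in switching circuit theory."""
    return a ^ b, a and b

# Truth table: inputs and (sum, carry) outputs.
for a in (False, True):
    for b in (False, True):
        s, c = half_adder(a, b)
        print(int(a), int(b), "->", int(s), int(c))
```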
The 1937 Atanasoff–Berry computer design was the first digital electronic computer, though it was not programmable. The Z3 computer, built by German inventor Konrad Zuse in 1941, was the first programmable, fully automatic computing machine, but it was not electronic.
Alan Turing modeled computation in terms of a one-dimensional storage tape, leading to the idea of the Turing machine and Turing-complete programming systems.
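A minimal simulator makes the model concrete. The transition table below is an arbitrary example of ours (a unary incrementer), not one of Turing's machines:

```python
def run_turing_machine(rules, tape, state="start", halt="halt"):
    """Simulate a one-tape Turing machine. `rules` maps
    (state, symbol) -> (symbol to write, head move of -1 or +1, next state)."""
    cells = dict(enumerate(tape))  # sparse tape; "_" is the blank symbol
    head = 0
    while state != halt:
        symbol = cells.get(head, "_")
        written, move, state = rules[(state, symbol)]
        cells[head] = written
        head += move
    return "".join(cells[i] for i in sorted(cells))

# Append one "1" to a block of 1s (unary increment).
rules = {
    ("start", "1"): ("1", +1, "start"),  # skip over existing 1s
    ("start", "_"): ("1", +1, "halt"),   # write a 1 on the first blank
}
print(run_turing_machine(rules, "111"))  # 1111
```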
During World War II, ballistics computing was done by women, who were hired as "computers". The term computer referred mostly to these women (whose role today would be described as "operator") until 1945, after which it took on its modern meaning, referring to machines.[24]
The ENIAC (Electronic Numerical Integrator And Computer) was the first electronic general-purpose computer, announced to the public in 1946. It was Turing-complete, digital, and capable of being reprogrammed to solve a full range of computing problems. Women implemented the programming for machines like the ENIAC, and men created the hardware.[24]
The Manchester Baby was the first electronic stored-program computer. It was built at the Victoria University of Manchester by Frederic C. Williams, Tom Kilburn and Geoff Tootill, and ran its first program on 21 June 1948.[25]
William Shockley, John Bardeen and Walter Brattain at Bell Labs invented the first working transistor, the point-contact transistor, in 1947, followed by the bipolar junction transistor in 1948.[26][27] At the University of Manchester in 1953, a team under the leadership of Tom Kilburn designed and built the first transistorized computer, called the Transistor Computer, a machine using the newly developed transistors instead of valves.[28] The first stored-program transistor computer was the ETL Mark III, developed by Japan's Electrotechnical Laboratory[29][30][31] from 1954[32] to 1956.[30] However, early junction transistors were relatively bulky devices that were difficult to manufacture on a mass-production basis, which limited them to a number of specialised applications.[33]
In 1954, 95% of computers in service were being used for engineering and scientific purposes.[34]
Personal computers
The metal–oxide–semiconductor field-effect transistor (MOSFET), also known as the MOS transistor, was invented by Mohamed Atalla and Dawon Kahng at Bell Labs in 1959.[35][36] It was the first truly compact transistor that could be miniaturised and mass-produced for a wide range of uses.[33] The MOSFET made it possible to build high-density integrated circuit chips.[37][38] The MOSFET later led to the microcomputer revolution,[39] and became the driving force behind the computer revolution.[40][41] The MOSFET is the most widely used transistor in computers,[42][43] and is the fundamental building block of digital electronics.[44]
The MOS integrated circuit, first proposed by Mohamed Atalla in 1960,[33] led to the invention of the microprocessor.[45][46] The silicon-gate MOS integrated circuit was developed by Federico Faggin at Fairchild Semiconductor in 1968.[47] This led to the development of the first single-chip microprocessor, the Intel 4004.[45] It began with the "Busicom Project"[48] as Masatoshi Shima's three-chip CPU design in 1968,[49][48] before Sharp's Tadashi Sasaki conceived of a single-chip CPU design, which he discussed with Busicom and Intel in 1968.[50] The Intel 4004 was then developed as a single-chip microprocessor from 1969 to 1970, led by Intel's Federico Faggin, Marcian Hoff, and Stanley Mazor, and Busicom's Masatoshi Shima.[48] The chip was mainly designed and realized by Faggin, with his silicon-gate MOS technology.[45] The microprocessor led to the microcomputer revolution, with the development of the microcomputer, which would later be called the personal computer (PC).
Most early microprocessors, such as the Intel 8008 and Intel 8080, were 8-bit. Texas Instruments released the first fully 16-bit microprocessor, the TMS9900 processor, in June 1976.[51] They used the microprocessor in the TI-99/4 and TI-99/4A computers.
The 1980s brought significant advances in microprocessors that greatly impacted the fields of engineering and other sciences. The Motorola 68000 microprocessor had a processing speed far superior to the other microprocessors in use at the time, and the newer microcomputers built around it could do correspondingly more computing. This was evident in the 1983 release of the Apple Lisa, the first commercially sold personal computer with a graphical user interface (GUI). It ran on the Motorola 68000 CPU and used dual floppy disk drives and a 5 MB hard drive for storage. The machine also had 1 MB of RAM, used for running software from disk without repeatedly rereading it.[52] After the Lisa failed commercially, Apple released its first Macintosh computer, still running on the Motorola 68000 microprocessor, but with only 128 KB of RAM, one floppy drive, and no hard drive, in order to lower the price.
The late 1980s and early 1990s brought further advances as computers became more useful for actual computational purposes. In 1989, Apple released the Macintosh Portable; it weighed 7.3 kg (16 lb) and was extremely expensive, costing US$7,300. At launch it was one of the most powerful laptops available, but due to its price and weight it was not met with great success and was discontinued only two years later. Around the same time, Intel introduced the Touchstone Delta supercomputer, which had 512 microprocessors. This technological advance was significant, as it served as a model for some of the fastest multi-processor systems in the world and as a prototype for Caltech researchers, who used it for projects such as real-time processing of satellite images and the simulation of molecular models for various fields of research.
Navigation and astronomy
Starting with known special cases, the calculation of logarithms and trigonometric functions can be performed by looking up values in a mathematical table and interpolating between the tabulated entries. For small enough differences, this linear operation was accurate enough for use in navigation and astronomy in the Age of Exploration. The uses of interpolation have thrived in the past 500 years: by the twentieth century Leslie Comrie and W. J. Eckert systematized the use of interpolation in tables of numbers for punch card calculation.
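A hedged sketch of the method: the table fragment below holds standard common-logarithm values, and the helper (name ours) interpolates linearly between them:

```python
import bisect

# Fragment of a common-logarithm table, as a navigator's almanac might tabulate.
xs = [2.0, 3.0, 4.0, 5.0]
logs = [0.30103, 0.47712, 0.60206, 0.69897]

def interp_log10(x):
    """Linear interpolation between tabulated values of log10(x)."""
    i = bisect.bisect_right(xs, x) - 1
    i = min(max(i, 0), len(xs) - 2)       # clamp to the table's range
    t = (x - xs[i]) / (xs[i + 1] - xs[i])
    return logs[i] + t * (logs[i + 1] - logs[i])

print(interp_log10(3.5))  # 0.53959; the true value is 0.54407
```

The small gap between the interpolated and true values is exactly the error that finer-grained tables, and later mechanical computation, were built to shrink.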
Weather prediction
The numerical solution of differential equations, notably the Navier-Stokes equations, was an important stimulus to computing, with Lewis Fry Richardson's numerical approach to solving differential equations. The first computerised weather forecast was performed in 1950 by a team composed of American meteorologists Jule Charney, Philip Thompson, Larry Gates, and Norwegian meteorologist Ragnar Fjørtoft, applied mathematician John von Neumann, and ENIAC programmer Klara Dan von Neumann.[53][54][55] To this day, some of the most powerful computer systems on Earth are used for weather forecasts.
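Richardson's method replaces derivatives with finite differences. A hedged toy version of the same idea is forward Euler stepping applied to a single equation dy/dt = -y (our example, vastly simpler than an atmospheric model):

```python
def euler(f, y0, t0, t1, steps):
    """Integrate dy/dt = f(t, y) with fixed-step forward Euler,
    the finite-difference idea behind early numerical forecasting."""
    h = (t1 - t0) / steps
    t, y = t0, y0
    for _ in range(steps):
        y += h * f(t, y)
        t += h
    return y

# dy/dt = -y with y(0) = 1; the exact answer is y(1) = 1/e = 0.36788...
print(euler(lambda t, y: -y, 1.0, 0.0, 1.0, 1000))  # 0.36770, close to 1/e
```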
Symbolic computations
By the late 1960s, computer systems could perform symbolic algebraic manipulations well enough to pass college-level calculus courses.
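Present-day computer algebra systems descend from that work. A brief sketch using the SymPy library (assuming it is installed; any CAS would serve):

```python
import sympy

x = sympy.symbols("x")
print(sympy.diff(sympy.sin(x) * x**2, x))   # x**2*cos(x) + 2*x*sin(x)
print(sympy.integrate(x * sympy.exp(x), x)) # (x - 1)*exp(x)
print(sympy.limit(sympy.sin(x) / x, x, 0))  # 1
```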
Important women and their contributions
Women are often underrepresented in STEM fields, when compared to their male counterparts.[56] However, there have been notable examples of women in the history of computing, such as:
- Ada Lovelace: wrote the notes appended to the description of Babbage's Analytical Engine, detailing, in poetic style, the first computer algorithm: a description of exactly how the Analytical Engine should have worked based on its design.
- Grace Murray Hopper: a pioneer of computing. She worked alongside Howard H. Aiken on IBM's Harvard Mark I. Hopper also came up with the term "debugging."
- Hedy Lamarr: invented a "frequency hopping" technology that was used by the Navy during World War II to control torpedoes via radio signals. This same technology is also used today in creating Bluetooth and Wi-Fi signals.
- Frances Elizabeth "Betty" Holberton: invented "breakpoints" which are mini pauses put into lines of computer code to help programmers easily detect, troubleshoot, and solve problems.
- Frances Elizabeth "Fran" Allen
- Karen Spärck Jones: responsible for "inverse document frequency" (IDF), a concept most commonly used by search engines; see the sketch after this list.
- Margaret Hamilton: the director of the Software Engineering Division of the MIT Instrumentation Laboratory, which developed on-board flight software for NASA's Apollo missions.
- Barbara Liskov: developed the "Liskov substitution principle."
- Radia Perlman: invented the "Spanning Tree Protocol," a key network protocol used in Ethernet networks.
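As a sketch of Spärck Jones's inverse document frequency idea (a modern restatement, not her original notation), the weight of a term falls as it appears in more of the documents in a collection:

```python
import math

def idf(term, documents):
    """Inverse document frequency: log(total documents /
    documents containing the term). Rarer terms score higher."""
    containing = sum(term in doc for doc in documents)
    return math.log(len(documents) / containing)

docs = [["the", "cat", "sat"], ["the", "dog", "ran"], ["a", "cat", "slept"]]
print(idf("the", docs))  # 0.405: common term, low weight
print(idf("dog", docs))  # 1.099: rarer term, higher weight
```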
See also
- Algorithm
- Charles Babbage Institute - research center for history of computing at University of Minnesota
- Computing timelines category
- History of software
- IT History Society
- List of mathematicians
- List of pioneers in computer science
- Timeline of quantum computing
References
- "Digital Computing - Dictionary definition of Digital Computing | Encyclopedia.com: FREE online dictionary". www.encyclopedia.com. Retrieved 2017-09-11.
- "One-to-One Correspondence: 0.5". Victoria Department of Education and Early Childhood Development. Archived from the original on 20 November 2012.
- Ifrah, Georges (2000), The Universal History of Numbers: From prehistory to the invention of the computer., John Wiley and Sons, p. 48, ISBN 0-471-39340-1
- W., Weisstein, Eric. "3, 4, 5 Triangle". mathworld.wolfram.com. Retrieved 2017-09-11.
- Konrad Lorenz (1961). King Solomon's Ring. Translated by Marjorie Kerr Wilson. London: Methuen. ISBN 0-416-53860-6.
- "DIY: Enrico Fermi's Back of the Envelope Calculations".
- Sinha, A. C. (1978). "On the status of recursive rules in transformational grammar". Lingua. 44 (2–3): 169–218. doi:10.1016/0024-3841(78)90076-1.
- Wolfram, Stephen (2002). A New Kind of Science. Wolfram Media, Inc. p. 1107. ISBN 1-57955-008-8.
- "Project Overview". The Antikythera Mechanism Research Project. Retrieved 2020-01-15.
- "Islam, Knowledge, and Science". University of Southern California. Archived from the original on 2008-01-19. Retrieved 2008-01-22.
- Lorch, R. P. (1976), "The Astronomical Instruments of Jabir ibn Aflah and the Torquetum", Centaurus, 20 (1): 11–34, Bibcode:1976Cent...20...11L, doi:10.1111/j.1600-0498.1976.tb00214.x
- Simon Singh, The Code Book, pp. 14-20
- "Al-Kindi, Cryptgraphy, Codebreaking and Ciphers". Retrieved 2007-01-12.
- Koetsier, Teun (2001), "On the prehistory of programmable machines: musical automata, looms, calculators", Mechanism and Machine Theory, Elsevier, 36 (5): 589–603, doi:10.1016/S0094-114X(01)00005-2..
- Ancient Discoveries, Episode 11: Ancient Robots, History Channel, archived from the original on March 1, 2014, retrieved 2008-09-06
- "Percy E. Ludgate Prize in Computer Science" (PDF). The John Gabriel Byrne Computer Science Collection. Retrieved 2020-01-15.
- Steinhaus, H. (1999). Mathematical Snapshots (3rd ed.). New York: Dover. pp. 92–95, p. 301.
- "Tutorial Guide to the EDSAC Simulator" (PDF). Retrieved 2020-01-15.
- Wynn-Williams, C. E. (July 2, 1931), "The Use of Thyratrons for High Speed Automatic Counting of Physical Phenomena", Proceedings of the Royal Society A, 132 (819): 295–310, Bibcode:1931RSPSA.132..295W, doi:10.1098/rspa.1931.0102
- History of Research on Switching Theory in Japan, IEEJ Transactions on Fundamentals and Materials, Vol. 124 (2004) No. 8, pp. 720-726, Institute of Electrical Engineers of Japan
- Switching Theory/Relay Circuit Network Theory/Theory of Logical Mathematics, IPSJ Computer Museum, Information Processing Society of Japan
- Radomir S. Stanković, Jaakko Astola (2008), Reprints from the Early Days of Information Sciences: TICSP Series On the Contributions of Akira Nakashima to Switching Theory, TICSP Series #40, Tampere International Center for Signal Processing, Tampere University of Technology
- Stanković, Radomir S.; Astola, Jaakko T.; Karpovsky, Mark G. "Some Historical Remarks on Switching Theory" (PDF). Tampere International Center for Signal Processing, Tampere University of Technology. CiteSeerX 10.1.1.66.1248.
- Light, Jennifer S. (July 1999). "When Computers Were Women". Technology and Culture. 40: 455–483.
- Enticknap, Nicholas (Summer 1998). "Computing's Golden Jubilee". Resurrection. The Computer Conservation Society (20). ISSN 0958-7403.
- Lee, Thomas H. (2003). The Design of CMOS Radio-Frequency Integrated Circuits (PDF). Cambridge University Press. ISBN 9781139643771.
- Puers, Robert; Baldi, Livio; Voorde, Marcel Van de; Nooten, Sebastiaan E. van (2017). Nanoelectronics: Materials, Devices, Applications, 2 Volumes. John Wiley & Sons. p. 14. ISBN 9783527340538.
- Lavington, Simon (1998), A History of Manchester Computers (2 ed.), Swindon: The British Computer Society, pp. 34–35
- Early Computers, Information Processing Society of Japan
- 【Electrotechnical Laboratory】 ETL Mark III Transistor-Based Computer, Information Processing Society of Japan
- Early Computers: Brief History, Information Processing Society of Japan
- Martin Fransman (1993), The Market and Beyond: Cooperation and Competition in Information Technology, page 19, Cambridge University Press
- Moskowitz, Sanford L. (2016). Advanced Materials Innovation: Managing Global Technology in the 21st century. John Wiley & Sons. pp. 165–167. ISBN 9780470508923.
- Ensmenger, Nathan (2010). The Computer Boys Take Over. p. 58. ISBN 978-0-262-05093-7.
- "1960 - Metal Oxide Semiconductor (MOS) Transistor Demonstrated". The Silicon Engine. Computer History Museum.
- Lojek, Bo (2007). History of Semiconductor Engineering. Springer Science & Business Media. pp. 321–3. ISBN 9783540342588.
- "Who Invented the Transistor?". Computer History Museum. 4 December 2013. Retrieved 20 July 2019.
- Hittinger, William C. (1973). "Metal-Oxide-Semiconductor Technology". Scientific American. 229 (2): 48–59. Bibcode:1973SciAm.229b..48H. doi:10.1038/scientificamerican0873-48. ISSN 0036-8733. JSTOR 24923169.
- Malmstadt, Howard V.; Enke, Christie G.; Crouch, Stanley R. (1994). Making the Right Connections: Microcomputers and Electronic Instrumentation. American Chemical Society. p. 389. ISBN 9780841228610. "The relative simplicity and low power requirements of MOSFETs have fostered today's microcomputer revolution."
- Fossum, Jerry G.; Trivedi, Vishal P. (2013). Fundamentals of Ultra-Thin-Body MOSFETs and FinFETs. Cambridge University Press. p. vii. ISBN 9781107434493.
- "Remarks by Director Iancu at the 2019 International Intellectual Property Conference". United States Patent and Trademark Office. June 10, 2019. Retrieved 20 July 2019.
- "Dawon Kahng". National Inventors Hall of Fame. Retrieved 27 June 2019.
- "Martin Atalla in Inventors Hall of Fame, 2009". Retrieved 21 June 2013.
- "Triumph of the MOS Transistor". YouTube. Computer History Museum. 6 August 2010. Retrieved 21 July 2019.
- "1971: Microprocessor Integrates CPU Function onto a Single Chip". Computer History Museum. Retrieved 22 July 2019.
- Colinge, Jean-Pierre; Greer, James C. (2016). Nanowire Transistors: Physics of Devices and Materials in One Dimension. Cambridge University Press. p. 2. ISBN 9781107052406.
- "1968: Silicon Gate Technology Developed for ICs". Computer History Museum. Retrieved 22 July 2019.
- Federico Faggin, The Making of the First Microprocessor, IEEE Solid-State Circuits Magazine, Winter 2009, IEEE Xplore
- Nigel Tout. "The Busicom 141-PF calculator and the Intel 4004 microprocessor". Retrieved November 15, 2009.
- Aspray, William (1994-05-25). "Oral-History: Tadashi Sasaki". Interview #211 for the Center for the History of Electrical Engineering. The Institute of Electrical and Electronics Engineers, Inc. Retrieved 2013-01-02.
- Conner, Stuart. "Stuart's TM 990 Series 16-bit Microcomputer Modules". www.stuartconner.me.uk. Retrieved 2017-09-05.
- "Computers | Timeline of Computer History | Computer History Museum". www.computerhistory.org. Retrieved 2017-09-05.
- Charney, J. G.; Fjørtoft, R.; von Neumann, J. (1950). "Numerical Integration of the Barotropic Vorticity Equation". Tellus. 2: 237–254.
- Witman, Sarah (16 June 2017). "Meet the Computer Scientist You Should Thank For Your Smartphone's Weather App". Smithsonian. Retrieved 22 July 2017.
- Edwards, Paul N. (2010). A Vast Machine: Computer Models, Climate Data, and the Politics of Global Warming. The MIT Press. ISBN 978-0262013925. Retrieved 2020-01-15.
- Myers, Blanca (March 3, 2018). "Women and Minorities in Tech, By the Numbers". Wired.
External links
- The History of Computing by J.A.N. Lee
- "Things that Count: the rise and fall of calculators"
- The History of Computing Project
- SIG on Computers, Information and Society of the Society for the History of Technology
- The Modern History of Computing
- A Chronology of Digital Computing Machines (to 1952) by Mark Brader
- Bitsavers, an effort to capture, salvage, and archive historical computer software and manuals from minicomputers and mainframes of the 1950s, 60s, 70s, and 80s
- "All-Magnetic Logic Computer". Timeline of Innovations. SRI International. Developed at SRI International in 1961
- Stephen White's excellent computer history site (the above article is a modified version of his work, used with permission)
- Soviet Digital Electronics Museum - a big collection of Soviet calculators, computers, computer mice and other devices
- Logarithmic timeline of greatest breakthroughs since start of computing era in 1623 by Jürgen Schmidhuber, from "The New AI: General & Sound & Relevant for Physics, In B. Goertzel and C. Pennachin, eds.: Artificial General Intelligence, p. 175-198, 2006."
- IEEE computer history timeline
- Computer History - a collection of articles by Bob Bemer
- A visual timeline of the development of computers since COLOSSUS' inception in 1943
- Computer Histories - An introductory course on the history of computing
British history links
- Resurrection Bulletin of the Computer Conservation Society (UK) 1990–2006
- The story of the Manchester Mark I (archive), 50th Anniversary website at the University of Manchester
- Richmond Arabian History of Computing Group Linking the Gulf and Europe