The history of the computer

Computers have a history stretching back more than 200 years. In the 19th century, mathematicians and entrepreneurs conceived of computing machines and built mechanical calculators to solve complex math problems. In the early 20th century, as technology advanced, these machines grew more complex and powerful. Below is a brief timeline of some significant events.

[Image: A woman using a vintage computer. Source: https://www.pexels.com/photo/black-and-white-photo-of-a-woman-using-a-vintage-computer-12950487/]

In 1801, a French inventor named Joseph Marie Jacquard built a loom that wove fabric designs encoded on punched wooden cards. The same punched-card idea was later adopted by early computers.

In 1821, an English mathematician named Charles Babbage conceived of a steam-driven machine, the Difference Engine, that could calculate tables of numbers. The British government funded the project, but it failed because the technology of the time was not advanced enough.

In 1843, an English mathematician named Ada Lovelace wrote what is regarded as the first computer program, in the notes to her English translation of a French paper about Babbage’s Analytical Engine. The program computed Bernoulli numbers, a sequence of rational numbers important in number theory and analysis, and it made her the world’s first computer programmer.
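
Lovelace’s notes laid out the computation step by step for Babbage’s engine. As a modern illustration of the same mathematics, here is a minimal sketch in Python using the standard recurrence for Bernoulli numbers; it is not a rendering of Lovelace’s actual program.

```python
from fractions import Fraction
from math import comb

def bernoulli(n):
    """Return B_0 .. B_n exactly, via the recurrence
    B_m = -1/(m+1) * sum_{k=0}^{m-1} C(m+1, k) * B_k."""
    B = [Fraction(1)]  # B_0 = 1
    for m in range(1, n + 1):
        B.append(-sum(comb(m + 1, k) * B[k] for k in range(m)) / (m + 1))
    return B

print(bernoulli(8))  # B_2 = 1/6, B_4 = -1/30; odd-index values beyond B_1 are 0
```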

In 1853, Swedish inventor Per Georg Scheutz and his son Edvard created the world’s first printing calculator. This machine was significant because it could calculate tabular differences and print the results.

In 1890, Herman Hollerith developed a punch-card system to help tabulate the 1890 U.S. Census. The machine saved the government several years of work and an estimated $5 million. Hollerith later founded a company that eventually became IBM.

In 1931, Vannevar Bush invented and built the Differential Analyzer at MIT. This was the first large-scale automatic general-purpose mechanical analog computer.

In 1936, British scientist and mathematician Alan Turing presented the principle of a universal machine, later known as the Turing machine, capable of computing anything that is computable. This concept remains central to modern computers. In 1939, he began working full-time at Bletchley Park, where codebreakers deciphered secret messages sent by Germany and its allies. Turing designed a machine called the Bombe, which greatly sped up the codebreaking work, and German Air Force messages were being read by mid-1940. The codebreaking effort has been estimated to have helped shorten the war by two to four years. After the war, Turing worked on the design of the Automatic Computing Engine (ACE), one of the earliest designs for a stored-program computer. In 1949, he was appointed Deputy Director of the Computing Machine Laboratory at the Victoria University of Manchester, where he worked on software for one of the earliest stored-program computers, the Manchester Mark 1.
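
To make the idea of a universal machine concrete, here is a minimal sketch of a Turing-style machine in Python: a finite table of rules that reads and writes one symbol at a time on a tape. The bit-flipping rule table is a toy example invented for illustration; the point is the model itself, which can in principle express any computation.

```python
def run(tape, rules, state="start", blank="_"):
    """Run a one-tape Turing machine until it reaches the 'halt' state."""
    cells, head = dict(enumerate(tape)), 0
    while state != "halt":
        symbol = cells.get(head, blank)              # read the cell under the head
        write, move, state = rules[(state, symbol)]  # look up the rule
        cells[head] = write                          # write, then move left/right
        head += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells)).strip(blank)

# Toy rule table: flip every bit, then halt at the first blank.
rules = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "L", "halt"),
}
print(run("10110", rules))  # -> 01001
```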

In 1937, John Vincent Atanasoff, a professor at Iowa State University, proposed building the first computer to work entirely by electricity, without gears, cams, belts, or shafts.

In 1941, Konrad Zuse completed the Z3, the world’s first working programmable, fully automatic digital computer. The machine was destroyed during World War II, but Zuse went on to release the Z4 in 1950. Meanwhile, in the U.S., Atanasoff and graduate student Clifford Berry designed the Atanasoff-Berry Computer (ABC), the first digital electronic computer in the country. It could store information in its main memory and performed one operation about every 15 seconds.

In 1945, John Mauchly and J. Presper Eckert built the Electronic Numerical Integrator and Computer (ENIAC), the first automatic, general-purpose, electronic, decimal, digital computer.

In 1946, Mauchly and Eckert left the University of Pennsylvania and obtained funding from the Census Bureau to build the UNIVAC (Universal Automatic Computer), the first commercial computer designed for business and government applications.

In 1947, William Shockley, John Bardeen, and Walter Brattain of Bell Laboratories invented the transistor. They discovered how to make an electric switch out of solid materials, with no need for a vacuum.

In 1949, a team at the University of Cambridge developed the Electronic Delay Storage Automatic Calculator (EDSAC), deemed “the first practical stored-program computer” by computer historian Gerard O’Regan. EDSAC executed its first programs in May 1949, computing a table of squares and a list of prime numbers. In November 1949, scientists at the Council for Scientific and Industrial Research (CSIR), now known as CSIRO, built Australia’s first digital computer, the Council for Scientific and Industrial Research Automatic Computer (CSIRAC). According to O’Regan, CSIRAC was the first digital computer in the world to play music.
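
Those two inaugural computations are easy to restate today. The few lines of Python below reproduce the results; the 1949 originals were, of course, written in EDSAC’s own order code rather than anything resembling a modern language.

```python
# A table of squares, like EDSAC's first run in May 1949.
squares = [(n, n * n) for n in range(1, 11)]

# A list of primes, by trial division up to the square root.
primes = [n for n in range(2, 50)
          if all(n % d for d in range(2, int(n**0.5) + 1))]

print(squares)  # (1, 1), (2, 4), ..., (10, 100)
print(primes)   # 2, 3, 5, 7, 11, ...
```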

In 1953, Grace Hopper developed the first computer language, which eventually became known as COBOL (Common Business-Oriented Language). Designed for business use, it became widely adopted, and Hopper was later dubbed the “First Lady of Software.” The same year, IBM’s Thomas Johnson Watson Jr. conceived the IBM 701 EDPM, built to help the United Nations keep track of Korea during the war.

In 1954, John Backus and his team at IBM created FORTRAN (Formula Translation), a programming language designed specifically for translating mathematical formulas into code.

In 1958, Jack Kilby and Robert Noyce independently invented the integrated circuit, commonly known as the computer chip. Kilby was awarded the 2000 Nobel Prize in Physics for his part in the invention.

In 1968, Douglas Engelbart demonstrated a prototype of the modern computer at a conference, complete with a mouse and a graphical user interface (GUI), showing that computers could become accessible to everyone, not just specialists.

In 1969, Ken Thompson, Dennis Ritchie, and a group of developers at Bell Labs created UNIX, an operating system that made large-scale networking of diverse computing systems, and eventually the internet, practical. The team went on to develop the operating system further in the C programming language.

In 1970, Intel unveiled the Intel 1103, the first Dynamic Random-Access Memory (DRAM) chip.

In 1971, a team of IBM engineers led by Alan Shugart invented the floppy disk, which enabled data to be shared among different computers.

In 1972, Ralph Baer’s Magnavox Odyssey, the world’s first home game console, was released, and Nolan Bushnell and Al Alcorn of Atari released Pong, the world’s first commercially successful video game.

In 1973, Robert Metcalfe developed Ethernet for connecting multiple computers and other hardware.

In 1975, the Altair 8800 appeared on the cover of Popular Electronics magazine, billed as the world’s first minicomputer kit to rival commercial models. Paul Allen and Bill Gates offered to write software for the Altair in BASIC and went on to form their own software company, Microsoft.

In 1976, Steve Jobs and Steve Wozniak co-founded Apple Computer and unveiled the Apple I, the first computer with a single circuit board and ROM.

In 1977, Radio Shack began an initial production run of 3,000 TRS-80 Model 1 computers, nicknamed the “Trash 80,” which sold for $599; within a year, the company had received 250,000 orders. The same year, the first West Coast Computer Faire took place in San Francisco, where Steve Jobs and Steve Wozniak showcased the Apple II, which featured color graphics and an audio cassette drive for storage.

In 1978, VisiCalc, the first computerized spreadsheet program, was introduced.

In 1979, MicroPro International released WordStar, the first commercially successful word processor. It was programmed by Rob Barnaby and consisted of 137,000 lines of code.

In 1981, IBM released its first personal computer, code-named “Acorn,” for $1,565. It ran the MS-DOS operating system and offered optional features such as a display, printer, diskette drives, extra memory, and a game adapter.

In 1983, Apple released the Lisa, the first personal computer with a GUI, drop-down menus, and icons. It was named after Steve Jobs’ daughter.

In 1984, the Apple Macintosh was introduced at a price of $2,500.

In 1985, Microsoft released Windows in response to the GUI pioneered by the Apple Lisa. The same year, Commodore announced the Amiga 1000.

In 1989, Tim Berners-Lee proposed the World Wide Web and HTML.

In 1993, Intel’s Pentium microprocessor advanced the use of graphics and music on PCs.

In 1995, two Stanford graduate students, Larry Page and Sergey Brin, met; the following year they created a search engine called BackRub that used backlink analysis to track data on the internet. From it they developed PageRank, which ranked websites by the number and quality of the links pointing to them. The pair self-funded Google until receiving a $100,000 check from Sun Microsystems co-founder Andy Bechtolsheim, and the company officially launched in 1998, opening its first office in Menlo Park, CA, the same year.
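
The core idea behind PageRank is compact: a page’s score is fed by the scores of the pages linking to it, and the scores are refined iteratively until they settle. The Python sketch below runs that iteration on a made-up four-page link graph; the damping factor of 0.85 comes from Page and Brin’s original paper, while everything else here is a simplified illustration rather than the production algorithm.

```python
# Hypothetical link graph: page -> pages it links to.
links = {"A": ["B", "C"], "B": ["C"], "C": ["A"], "D": ["C"]}
pages = list(links)
d = 0.85                                   # damping factor from the original paper
rank = {p: 1 / len(pages) for p in pages}  # start with equal scores

for _ in range(50):  # power iteration; 50 rounds is plenty for 4 pages
    shares = {p: sum(rank[q] / len(links[q]) for q in pages if p in links[q])
              for p in pages}
    rank = {p: (1 - d) / len(pages) + d * shares[p] for p in pages}

print(sorted(rank.items(), key=lambda kv: -kv[1]))  # C and A rank highest
```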

In 1997, Microsoft invested $150 million in Apple as part of a deal that settled the companies’ long-running legal disputes.

In 1999, Wi-Fi was developed, initially covering a distance of up to 91 meters.

In 2000, Microsoft unveiled its Tablet PC, an early tablet computer.

In 2001, Apple released a new operating system, Mac OS X (later renamed macOS), to replace the classic Mac OS. It went through 16 versions, each carrying “10” in its title; the first nine were named after big cats, starting with “Cheetah.”

In 2003, AMD released the Athlon 64, the first 64-bit processor for personal computers. WordPress and LinkedIn also launched that year.

In 2004, the Mozilla Foundation launched Mozilla Firefox 1.0, a web browser that challenged Microsoft’s Internet Explorer; it was downloaded more than a billion times in its first five years. The year also marked a shift toward Web 2.0, in which people actively participated on the internet rather than just consuming content, and Mark Zuckerberg launched Facebook.

In 2005, Google bought Android, a Linux-based mobile phone operating system. USB flash drives were increasingly replacing floppy disks, Google Analytics was launched, and YouTube debuted as a video platform.

[Image: Floppy disks. Source: https://pixabay.com/photos/diskettes-floppy-disks-7484464/]

In 2006, Apple released the MacBook Pro, its first Intel-based, dual-core mobile computer. Twitter also launched to the public that year.

In 2009, Microsoft launched Windows 7, which introduced features such as pinning applications to the taskbar and thumbnail previews of open windows.

In 2010, Apple unveiled the iPad, its flagship handheld tablet.

In 2011, Google released the Chromebook, which runs on Google Chrome OS. In addition, computer chips etched with a typical half-pitch (i.e., half the distance between identical features in an array) size of 22 nanometers entered mass production.

In 2012, quad-core smartphones and tablets were released, offering faster processing power.

In 2014, computer chips with a typical half-pitch of 14 nanometers were released. The market for smartwatches also reached $5 million.

In 2015, Apple released the Apple Watch, and Microsoft released Windows 10.

[Image: An Apple Watch. Source: https://www.pexels.com/photo/person-wearing-silver-aluminum-case-apple-watch-with-white-sport-band-5081424/]

In 2016, researchers created the first reprogrammable quantum computer, a machine that could be loaded with new algorithms rather than being hard-wired for a single task.

In 2017, DARPA began a program called “Molecular Informatics” that explores using molecules as computers. The agency argued that chemistry offers a rich set of properties for fast, scalable information storage and processing: each molecule has a unique structure and characteristics, enabling new ways to encode and process data beyond the 0s and 1s of traditional digital architectures.

In 2023, OpenAI introduced GPT-4, a large language model that can help users with varied writing tasks, such as composing songs, writing scripts, or matching a user’s unique writing style. Moreover, on May 30, 2023, the Center for AI Safety (CAIS) released a statement, signed by leaders of OpenAI and DeepMind, Turing Award laureates, and other artificial intelligence researchers, warning that mitigating the risk of extinction from AI should be a global priority.

Sources:
https://www.livescience.com/20718-computer-history.html
https://www.britishlegion.org.uk/stories/alan-turing-s-legacy-codebreaking-computing-and-turing-s-law
https://www.thestreet.com/technology/history-of-google-14820930
https://www.complete-it.co.uk/the-history-of-information-technology/
https://www.computerhope.com/history/2023.htm
https://openai.com/gpt-4