So...Who actually invented the FIRST computer? The short answer: No one. There was no first computer. Computers are a process, constantly evolving. Let’s take a stroll down memory lane...
Well (not to sound like a lawyer) it depends on how you define “computer.” For example, back in 2700 - 2300 B.C. the Sumerians invented the abacus, a handheld “machine” (still in existence today) in which beads set into successive columns mark successive orders of magnitude of their sexagesimal (base-60) number system. It was mechanical, not electric or electronic. Much later, some 2,100 years ago, the Greeks built the Antikythera Mechanism, an extremely complex machine (driven by a hand crank) which calculated the passage of time and the movements of celestial bodies with great precision, arguably the first analog computer. [You just had to ask.]
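To make the idea of “columns marking successive orders of magnitude” concrete, here is a minimal Python sketch (my own illustration, obviously nothing the Sumerians had) that splits a number into base-60 “columns” the way an abacus built for a sexagesimal system would:

```python
def to_sexagesimal(n: int) -> list[int]:
    """Split a non-negative integer into base-60 digits, most significant first.

    Each 'digit' (0-59) corresponds to one abacus column: the rightmost column
    counts units, the next counts sixties, the next counts 3,600s, and so on.
    """
    if n == 0:
        return [0]
    digits = []
    while n > 0:
        digits.append(n % 60)   # value shown in the current column
        n //= 60                # move to the next, more significant column
    return digits[::-1]

# Example: 7,265 units = 2 * 3600 + 1 * 60 + 5
print(to_sexagesimal(7265))  # [2, 1, 5]
```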
Next came electrically driven calculating machines such as the Victor calculator and the Monroe, an advanced calculator with a typewriter-like carriage, capable of solving simple equations [see below]. Then, in 1965, came the Olivetti Programma 101, an Italian supercalculator of sorts which many also consider the first desktop computer, followed in the 1970s by the TI programmable calculators.
Backing up a bit: in the 1940s, computers evolved into electronic and electro-mechanical machines which did more than just add and subtract - they performed mathematical calculations and other tasks generally far too time-consuming or complex for humans. (The first design for a programmable computer, though, was Charles Babbage’s Analytical Engine, conceived a century earlier but never completed.) Westinghouse claims to have built the world’s first electric-powered computer in 1913 to transmit messages from the rail lines to central command at Grand Central, in a hidden room called M-42. (You know, the one Hitler’s saboteurs tried to infiltrate, hoping to destroy the rotary converters which converted alternating current to direct current, since trains were used to move troops.) The coded messages were printed out via ticker tape, and agents would use a wooden switchboard to call the operations department to report switch problems or a stalled train.
Then came the age of the “big iron” machines. Perhaps the ENIAC (short for Electronic Numerical Integrator And Computer) was the first general-purpose electronic computer. It was a Turing-complete digital computer capable of being reprogrammed to solve a full range of computing problems. ENIAC was originally designed to calculate artillery firing tables for the U.S. Army's Ballistic Research Laboratory, but its first actual use was in calculations for the hydrogen bomb. When ENIAC was announced in 1946 it was heralded in the press as the "Giant Brain," boasting speeds roughly a thousand times faster than the electro-mechanical machines that preceded it. (Although, once it was turned off, it lost its data.) But it was huge, filling an entire room, weighing 30 tons and containing nearly 18,000 vacuum tubes. (Click HERE for more info.)

About the same time, in 1943, the Brits at Bletchley Park built the Heath Robinson machine (named after the cartoonist famous for drawing fantastically complicated contraptions) and shortly thereafter the much more reliable, all-electronic Colossus (which read perforated paper tape at high speed), both of which helped the WW2 code-breaking effort. Later there was the pre-binary-code Harwell Dekatron, a/k/a the Wolverhampton Instrument for Teaching Computation (“WITCH”), a 2.5-ton machine built in the early 1950s, used at the time for atomic research and mothballed in 1973 as binary-code computers took over. The machine, which sounded like a noisy typewriter, stored up to 40 eight-digit numbers (about as much as a pocket calculator) using 40 banks of 8 Dekatron valves (which looked like flashing Christmas light bulbs, each one holding a single decimal digit - see the sketch below) and took about 5-10 seconds just to multiply two numbers! The programs and data were entered using punched strips of tape (see photo). A few years after ENIAC, the Brits also built EDSAC (Electronic Delay Storage Automatic Calculator) at the University of Cambridge, where it aided research into areas such as genetics, meteorology and X-ray crystallography. Its design was later used to help create LEO, allegedly the world’s first business computer. I guess everyone wants to be first and, with no real definition of a computer, that title is up for grabs.
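As a rough illustration of that decimal (rather than binary) storage scheme, here is a toy Python model, purely my own sketch, of a WITCH-style storage bank: eight Dekatron tubes, each holding one decimal digit from 0 to 9.

```python
class DekatronRegister:
    """Toy model of one WITCH storage bank: 8 Dekatron tubes, one decimal digit each."""

    def __init__(self, tubes: int = 8):
        self.digits = [0] * tubes  # each tube rests on one of ten positions, 0-9

    def store(self, value: int) -> None:
        """Store an up-to-8-digit non-negative number, one digit per tube."""
        if not (0 <= value < 10 ** len(self.digits)):
            raise ValueError("number does not fit in this bank")
        for i in range(len(self.digits) - 1, -1, -1):
            self.digits[i] = value % 10
            value //= 10

    def read(self) -> int:
        """Read the stored number back out of the tubes."""
        value = 0
        for d in self.digits:
            value = value * 10 + d
        return value

# Forty such banks would give the machine's full store of 40 eight-digit numbers.
store = [DekatronRegister() for _ in range(40)]
store[0].store(12345678)
print(store[0].read())   # 12345678
print(store[0].digits)   # [1, 2, 3, 4, 5, 6, 7, 8]
```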
Moving forward, modern computers don’t take up an entire building, ushering in the age of far smaller - and far more reliable - hardware. The old vacuum tube computers could be down half the time, because every day some tubes burned out, taking the stored data with them when the computer went down; the invention of the transistor solved this problem. In this sense, the first “electronic” computer was actually invented back in the late 1930s by John Atanasoff, and it was a scientific, not a personal, computer [see below]. That, at least, is according to the U.S. court system, which settled the question in 1973, as documented in the book “The Man Who Invented the Computer” by Jane Smiley (2010).
Over the next forty years, the computer steadily got smaller and more powerful, reaching the age of the PC (“Personal Computer”) in the 1980s. [In the 1960s, DEC introduced the first “compact” computer, but it was priced at $125,000 without any software or peripherals, so we’re discounting that one.] At that point, “embedded” computers like those used in cars and appliances (the “IoT”) also appeared en masse. There are lots of claims to the title of inventor of the first PC. Contrary to popular belief, the PC was not invented by Bill Gates (whose company, Microsoft, developed Windows, a graphical software interface which made computers easier to use) or Steve Jobs and Steve Wozniak (whose 1976 Apple I is often called the first true “personal” computer, and whose later machines used a mouse and a graphical user interface (“GUI”) to make the whole process easier for non-scientists), although some have credited them. Also credited have been the Xerox Palo Alto Research Center’s Alto computer (3/1/73; with a mouse-driven graphical interface and movable, overlapping “windows” in bitmap graphics), Apollo’s Domain workstations (1981; running the Aegis operating system, later renamed Domain/OS, with its DM [“display manager”]) and even IBM (1981). Many more people credit H. Edward Roberts, an early mentor of Bill Gates, who built the Altair 8800 (1975); Gates and Paul Allen wrote the BASIC interpreter which actually ran on it. The actual patent for a PC was issued in the U.S. on July 25, 1972 to Jack Frassanito, who in 1968 designed a self-contained unit with its own processor, display, keyboard, internal memory and mass storage of data. It was named the Datapoint 2200, cost about $5,000 and had up to 8,000 bytes of internal storage, with another 300,000 bytes on two cassette tapes.
In the 2000s, computers became even smaller, as “smart” cellular phones such as the iPhone and the Droid allowed the installation of various downloadable “apps” - actually small programs of the kind which used to run on full-scale computers, now delivered over the Internet and shared with other cell phone users. Of course, following the rule that as a device becomes smaller it also becomes more costly, smart phones have gone up in price from about $150 to close to $1,000. All this while the cost of desktops has fallen drastically: For example, as discussed above, the Programma 101 cost $3,500 in 1965 (about $24,000 today), the top-end IBM Portable cost $19,975 ($88,000 today), the Apple Lisa cost $9,995 in 1983 ($24,000 today), the Osborne $2,800 in 1985 ($6,200 today), the Macintosh Portable $6,500 in 1989 ($12,500 today), and even a 1999 Dell Dimension tower $2,300 ($3,400 today).
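Those “in today’s dollars” figures are simple ratio arithmetic: the old price is scaled by how much the general price level has risen since the purchase year. Here is a minimal Python sketch of that calculation; the multipliers below are placeholders of my own for illustration, not official CPI figures.

```python
# Hypothetical cumulative price-level multipliers relative to today (illustrative only).
INFLATION_FACTOR = {
    1965: 8.0,   # placeholder: $1 in 1965 ~ $8 today
    1983: 2.6,   # placeholder
    1999: 1.5,   # placeholder
}

def price_today(original_price: float, year: int) -> float:
    """Scale a historical price into today's dollars using that year's multiplier."""
    return original_price * INFLATION_FACTOR[year]

# e.g. a $3,500 machine bought in 1965:
print(f"${price_today(3500, 1965):,.0f}")  # $28,000 with the placeholder multiplier
```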
Presently, computers are moving toward “neuromorphic” machines, which mimic the structure of the human brain with billions of transistors built into silicon semiconductor chips, and, on a separate track, toward quantum computers. See Computers.
CONCLUSION: In short, it all depends on what you define as a computer. Manual/electric/electronic/electromechanical/digital? Business/general-purpose/scientific/government? U.S./British/other? Tubes/transistors/silicon/graphene? Paper tape/Hollerith cards/keyboards? Speeds? Drive storage? Everyone has a claim to fame! And, even then, there’s no agreement. After I researched this, I found an even more subdivided series of definitions HERE.
For more milestones, see below: