

COMPUTER: The simple definition

An electronic device (“a machine”);

That applies instructions (“programs”);

To data (stored or available to that machine);

At a high rate of speed.

Stripped to its essentials, the key to computers is that they can “store” data and that they are “programmable”.  Once the data is placed in the machine, it’s “stored” so you don’t have to “reinvent the wheel” by re-creating it or changing it each time you want to use it.  What makes computers unique, then, is their MEMORY.  Unlike most other devices, once you turn a computer off, it can retain your data.  Also, the machine can be endlessly programmed with different instructions about what to do with that data, from “what if” calculations on numbers to mailing lists applied to a form letter.
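As a tiny illustration of “stored data plus interchangeable programs” (a made-up example in Python, not taken from any particular software package), the very same stored list can feed a “what if” calculation one moment and a form letter the next:

```python
# Illustration only: one set of stored data, two different "programs".
customers = [                                  # the stored data
    {"name": "Ann", "balance": 120.00},
    {"name": "Bob", "balance": 80.00},
]

# Program 1: a "what if" calculation on the numbers
total_if_balances_doubled = sum(c["balance"] * 2 for c in customers)
print(total_if_balances_doubled)               # 400.0

# Program 2: a mailing list applied to a form letter
for c in customers:
    print(f"Dear {c['name']}, your current balance is ${c['balance']:.2f}.")
```

Because the data is stored, neither “program” requires re-typing the list; swap the instructions and the same machine does something entirely different.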

In addition, since the mid-1980s, computers have served an important secondary purpose as communication devices.  The rise of the Internet, combined with the increase in computing power in general and the progression from dial-up telephone lines to cable, DSL and FIOS, which has dramatically increased transmission speeds, has given computers the ability to communicate with each other, even for users who aren’t going to use their computer to create documents or solve complicated math problems.  The rise of e-mail and texting, as well as social networking sites such as Facebook and Twitter, has created an entirely new world where computer users communicate with each other constantly and instantaneously, sharing data like documents, music and videos from their computers or even their smart phones.  Services such as Skype have made it possible to converse using audio or even video over the Internet for free or at a very low cost.  This ability to control our own journey - to know where we are going and how to get there efficiently through technology - led Steve Jobs in 1990 to describe a computer as “a bicycle for our minds”.  Great description!

Colloquially, most people consider the combination of a monitor, keyboard, mouse and CPU (and maybe a printer) to be a “computer” (like the graphic to the right).  But, strictly speaking, that’s not true; that’s a “computer system”.  Technically, only the “tower” or “box” (which is the CPU) is the actual computer, the brain of the system.  [Everyone these days recognizes what a computer looks like, but for a stroll down memory lane from those of us who were there at the beginning of its evolution, and the answer to the question “Who Invented the Computer?”, click HERE.]

More information (probably more than you want to know right now), only if you’re interested:

1.  So, who “invented” the computer?  The answer to this question is much the same as the one for “who invented the Internet”: the invention of the computer has been a “process”, not a “point in time”.  While Alan Turing hypothesized the idea of a computing machine that could process software instructions, it took John von Neumann to add the concept of containing all of the instructions and data within the machine’s memory, then governments to build ENIAC and other tube-based computers, after which the invention of the transistor and the semiconductor (see Chips) made it possible for people like Bill Gates and Steve Jobs to mass-produce computers and popularize them.  (For more, see Who Invented The Computer and Invention vs. Use.)

For the past 70 years or so, what’s inside the standard computer’s “box” has been based on the von Neumann (or “stored-program”) architecture (no, not Jerry Seinfeld’s plump postman nemesis “Newman-n-n” from the 1990s sitcom Seinfeld, but Hungarian-American mathematician John von Neumann [1903-1957], photo at right).  As a result of this architecture, almost all of today’s computers have a main board, hard drive, memory, input and output, audio and video cards and an operating system.  (And yes, there are other architectures - the Harvard architecture, in which the CPU can read an instruction and perform a data memory access at the same time, even without a cache; also data flow, cellular, Non-Uniform Memory Access (“NUMA”) and reduction computers, as well as quantum and even chemical computers.)  While the concept of stored information may seem universal and commonplace today, it wasn’t originally.  Even the original huge ENIAC computers ran perpetually; if shut down, they lost their data.

2.  Aside from the classic desktop computer shown above, there are many other sizes and designs for computers.  For example, see laptops and tablets (and their growing list of subcategories, like netbooks and tablets/pads) for common types of portable computers.  On the larger side, there are minicomputers, mainframes and supercomputers (powerful computers which can support many simultaneous and parallel users and perform billions of instructions a second to solve extremely complex calculations).  Then there are the specialty types of computers, such as ingestible, wearable, HMDs and HUDs, Google Glass, augmented, virtual and embedded computers, to name just a few, as well as the Raspberry Pi and other mini-computers.  Computers can be found in virtually everything these days, and the list is growing daily.  See the discussion about the Internet of Things - everything from your car to your toaster, your toilet to your pacemaker may have built-in computers, and many may be reporting information back to you or someone else, whether or not you are even aware of it.

3.  Of course, nothing is quite as simple as it seems.  How computers process instructions (called “programs”) can vary as well (see Programming for more).  Computers can also have different “operating systems”; the “O/S” dictates how those instructions are carried out by the machine.  Windows (PC), Apple’s macOS, Unix and Linux are examples of different operating systems.  And, of course, the instructions themselves can be written in a variety of programming languages.  Finally, you may not believe this, but when it comes to arithmetic, computers can really only add - nothing else (more on this in item 8 below).

4.  Moreover, there are different types of system architecture, i.e. hardware and/or software design, so the way a particular machine processes instructions from Point A to Point B may be completely different, even though the end result may look exactly the same on two different computers.

5.  And that’s just the software.  Hardware-wise, early computers were analog; then they became digital, and there are also some hybrid ones combining some of the advantages of both types.

6.  Also, see CPU, below, for more about what’s inside the computer’s main “box”.  And see keyboards, cases and screens for more information about those peripheral hardware items, as well as a discussion of what makes a computer fast.  Also, see computers for seniors.

7.  THE FUTURE - ADVANCED STUFF - NEURAL COMPUTERS:  Moore’s law (that the number of transistors on an integrated circuit will double every two years) appears to be reaching its practical limit, because it becomes harder and harder to etch features smaller than about 7 nanometers onto a silicon chip (see semiconductor).  (Apparently there is an actual limit to “how many angels can dance on the head of a pin”.)  Of course, the smaller the features, the more transistors can be packed onto a chip and the shorter the distance between them, reducing the time it takes for electricity to flow between them and increasing the operating speed of the chip.  While Intel is currently manufacturing 14nm chips and planning for 10nm ones soon, the 7nm limit will eventually be reached, and growth after that will stall unless development moves toward the atomic level.  A nanometer is one billionth of a meter, far thinner than a strand of hair; by comparison, a single strand of DNA has a diameter of about 2.5 nanometers.  Companies like HP and IBM are pouring literally billions of dollars into new computing and chip materials, rethinking computer hardware design for a future that may not include silicon chips, at least for supercomputers.  UPDATE:  In July 2015, IBM announced that, working in cooperation with GlobalFoundries, Samsung and the State University of New York, it had in fact created a test chip with working 7nm transistors.
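If you want to see how quickly that doubling adds up, here is a minimal back-of-the-envelope sketch in Python (the starting year and transistor count are made-up round numbers, purely for illustration):

```python
# Illustration only: project transistor counts under Moore's law
# (a doubling roughly every two years).  The starting figures below
# are assumed round numbers, not any actual chip's specifications.
start_year = 2015
start_transistors = 1_000_000_000          # assume ~1 billion transistors

for years_ahead in range(0, 11, 2):
    count = start_transistors * 2 ** (years_ahead // 2)
    print(f"{start_year + years_ahead}: roughly {count:,} transistors")
```

After just a decade of doubling, the count has grown 32-fold, which is why even a small slowdown at the 7nm limit matters so much.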

How will the atomic level be achieved?  In 2014, scientists at Northwestern University discovered that a single laser can stop rotating molecules in their tracks so that other applications can be harnessed, a major step toward building quantum computers.  Graphene, carbon nanotubes (cylinders made of carbon atoms) and other materials that can be scaled down to the atomic level may someday be used to create functional quantum computers and cognitive computers (see AI) that mimic brain functionality.  And Stanford University has developed a multi-layered “high-rise” chip using nanotechnology, significantly increasing performance by building layers of processing on top of layers of memory, cutting down on the time and energy required to move information back and forth between memory and processing.

Neural Computers:  IBM has spent $3 billion to develop computers that mimic brain-like functionality through its SyNAPSE (“Systems of Neuromorphic Adaptive Plastic Scalable Electronics”) program, originated at DARPA, whose goal is a “neural” system with 10 billion neurons and 100 trillion synapses that uses only 1 kilowatt of power.  IBM has revealed a second-generation processor (“TrueNorth”) with 5.4 billion transistors woven onto an on-chip network of 4,096 neurosynaptic cores, 10 times more than the first SyNAPSE generation.  As opposed to the common von Neumann architecture (discussed above), where computations are made quickly but in a serial fashion, neural chips approximate how the brain works: each “neurosynaptic core” has its own memory (“synapses”), a processor (“neuron”) and communication conduits (“axons”), which operate together in an event-driven fashion, using very little power and providing nuanced pattern recognition and other sensing capabilities, much the way a brain does.  These computers should benefit areas such as cancer research, weather prediction, scientific modeling and the like, and may eventually filter down to desktop and laptop computers and the IoT, but that is years away, if ever.
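To get a feel for the “event-driven” idea (this is a toy sketch in Python, not IBM’s actual chip design), here is a single simulated neuron that keeps its own local “memory”, integrates incoming signals, and only produces output (a “spike” on its “axon”) when a threshold is crossed - sitting idle the rest of the time:

```python
# Toy "event-driven" neuron (leaky integrate-and-fire), for illustration only.
# It keeps its own local state (its "membrane potential"), and produces output
# (a "spike") only when enough input has accumulated - idle the rest of the time.

def run_neuron(inputs, threshold=1.0, leak=0.9):
    potential = 0.0          # the neuron's local "memory"
    spikes = []
    for t, current in enumerate(inputs):
        potential = potential * leak + current   # integrate input, with leak
        if potential >= threshold:               # event: threshold crossed
            spikes.append(t)                     # emit a spike on the "axon"
            potential = 0.0                      # reset after firing
    return spikes

# Example: mostly silence, with a burst of input around t = 3-5
print(run_neuron([0.0, 0.2, 0.1, 0.5, 0.6, 0.4, 0.0, 0.0]))   # [4]
```

A real neuromorphic chip wires millions of such neurons together in hardware; the point of the sketch is simply that computation happens only when events arrive, which is part of why these chips can use so little power.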

Quantum Computers:  D-Wave Systems, a Canadian company, has built a quantum computer that generates 512 qubits of computing power, currently housed at NASA’s Ames Research Center and used by Google.  While Google is looking to combine approaches from several QC methodologies, IBM is creating its own custom hardware for QCs, and Microsoft is using building blocks called “topological qubits” as the basis for its own hardware stack, which it believes are better suited to real-world applications, not just labs.  Generally, because QCs must operate under highly controlled and isolated conditions, general-purpose QCs are difficult to create and have a limited market; QCs are useful for solving only a very narrow range of problems - ones that are impractical for conventional computers.  But don’t get too excited yet:  Prof. Peter McMahon of Stanford, a quantum computing researcher, estimates that the magic number of qubits needed to achieve superfast calculations is between 10 and 100 million, and that’s still going to be years, even decades, away.  Nevertheless, fearing that quantum computers will break all current cryptography, Google is already testing new “post-quantum cryptography” to secure communications.  See also quantum computers (and click HERE for current progress).  Computers have continually evolved from mainframes to desktops, desktops to laptops, laptops to netbooks, netbooks to pads.  Now a new generation of popular computing devices is evolving - those used in HUDs, augmented reality and Google Glass, pushed by the emergence of the IoT.  Cars with displays superimposed on windshields (like those in Jaguar and BMW), and devices, possibly even full computers, with virtual displays may very well become the norm in just a few years.
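To get a rough sense of what a “qubit” is (this is a toy simulation in Python using NumPy, not real quantum hardware), remember that a classical bit is always either 0 or 1, while a qubit’s state is a pair of “amplitudes” whose squared sizes give the odds of reading a 0 or a 1:

```python
# Toy single-qubit simulation, for illustration only (not real quantum hardware).
import numpy as np

# A qubit state is a vector of two complex "amplitudes", one for |0> and one for |1>.
state = np.array([1.0, 0.0], dtype=complex)      # start in the |0> state

# A Hadamard gate puts the qubit into an equal superposition of 0 and 1.
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)
state = H @ state

# Measurement probabilities are the squared magnitudes of the amplitudes.
probs = np.abs(state) ** 2
print(probs)    # [0.5 0.5] - a 50/50 chance of reading 0 or 1
```

Simulating a handful of qubits this way is easy; the catch is that the state vector doubles in size with every qubit added, which is exactly why real quantum hardware, rather than simulation, is needed for the large qubit counts mentioned above.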

8.  Did you know that computers aren’t really that intelligent?  Sounds like one of those Geico commercials, but it’s true. When it comes to calculations, they really do only one thing - add.  But they do it really, really fast, so you think they’re doing much more.  Click HERE for an explanation.
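Here is a rough sketch in Python of one way a machine can “subtract” using nothing but addition - by adding the “two’s complement” of a number (a simplified model for illustration, not the wiring of any particular chip):

```python
# Illustration: subtraction done purely with addition, using 8-bit
# two's complement arithmetic (a simplified model, not a real CPU design).

BITS = 8
MASK = (1 << BITS) - 1          # 0b11111111 for 8 bits

def subtract_by_adding(a, b):
    # "Negate" b by inverting its bits and adding 1 (two's complement),
    # then add it to a.  No subtraction operator is ever used.
    neg_b = ((b ^ MASK) + 1) & MASK
    return (a + neg_b) & MASK

print(subtract_by_adding(9, 4))   # 5
print(subtract_by_adding(4, 9))   # 251, i.e. -5 in 8-bit two's complement
```

Multiplication can likewise be reduced to repeated addition, and division to repeated subtraction, so a blindingly fast adder really can carry the whole load.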
