From something old to something new, will just a byte of data do?
|By MARK OLLIG|
What is believed to be the world’s oldest computer is known as the “Antikythera Mechanism,” and it was made more than 2,100 years ago by the ancient Greeks.
This mechanical instrument was found in 1901 by sponge divers amid the wreckage of a Roman cargo ship that sank off the small Greek island of Antikythera around 80 BC.
A mystery for many years, the bronze object was even thought by some to be extraterrestrial in design because of its detailed geometric shape and its 37 complex gear wheels.
The Antikythera Mechanism has a differential gear, which was thought to have been invented in the 16th century.
Scientists now believe that the device was a complex and very accurate mechanical analog astronomical calculating computer that could predict the positions of the sun, moon, and planets, and even forecast lunar eclipses.
The Antikythera Mechanism, estimated at 12 inches tall, 8 inches wide, and 3 to 4 inches deep, is the oldest known device to use gear wheels and is by far the most sophisticated surviving mechanism from the ancient world.
More information about the Antikythera Mechanism, along with new photographs, can be found at a web site dedicated to it: http://www.antikythera-mechanism.com.
Today’s personal computers certainly have come a long way in the last few years insofar as processor speed and memory, but are similar to the Antikythera Mechanism when compared to the world’s fastest supercomputer.
Let’s take a look at the latest version of IBM’s “BlueGene/L.”
The IBM BlueGene/L supercomputer is a “massively-parallel” computing system consisting of up to 131,072 individual central processing units (CPUs). It is located at Lawrence Livermore National Laboratory in Livermore, Calif.
It’s interesting to note that on Sept. 29, 2004, IBM announced that a BlueGene/L prototype at the IBM facility in Rochester, Minn., had overtaken NEC’s Earth Simulator as the fastest computer in the world.
The Livermore laboratory is a US Department of Energy research facility that has been making breakthrough advances in computer science and engineering since 1952. It is operated by the University of California, and its web site is at: http://www.llnl.gov/.
BlueGene/L itself covers 2,500 square feet of floor space and has 32 terabytes (32 trillion bytes) of storage memory. Its computing performance has been reported at 367 teraflops; a teraflop is one trillion floating point operations per second.
Our handheld calculators could be measured in flops. Each calculation, such as “5.2 * 2.7”, is a single floating point operation, or flop.
Today’s personal computers can perform calculations that exceed one gigaflop, or a billion floating point operations per second.
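To put those scales side by side, here is a quick back-of-the-envelope sketch in Python; the figures are just the round numbers cited above (the reported 367-teraflop peak and a ballpark one-gigaflop PC), not measured benchmarks:

```python
# Floating point performance scales, as powers of 10.
GIGAFLOP = 10**9   # one billion floating point operations per second
TERAFLOP = 10**12  # one trillion flops
PETAFLOP = 10**15  # 1,000 trillion flops (BlueGene/P's expected range)

bluegene_l = 367 * TERAFLOP  # BlueGene/L's reported performance
home_pc = 1 * GIGAFLOP       # a rough figure for a personal computer

# How many one-gigaflop PCs would it take to match BlueGene/L?
ratio = bluegene_l // home_pc
print(f"BlueGene/L is roughly {ratio:,} times a one-gigaflop PC")
```

Run it, and the arithmetic shows the supercomputer out-calculating the desktop machine by a factor of several hundred thousand.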
IBM’s next-generation supercomputer is called “BlueGene/P.” It is expected to operate at around one petaflop (1,000 trillion floating point operations per second) and should be completed around 2008.
“The world’s top supercomputers can produce data at a rate equal to putting out one complete collection of the Library of Congress every few seconds,” said Bruce Goodwin, associate director for defense and nuclear technologies at Lawrence Livermore National Laboratory in California.
When I read that quote by Goodwin, I wondered just how much computer data the Library of Congress actually holds. I looked it up, and the answer is 10TB (10 terabytes, or 10 trillion bytes).
And with that, I wanted to explore computer data storage, so I visited (online) a respected university and found some interesting details to pass along to you.
A single computer decision, either a “1” or a “0,” is a single bit of computer memory data storage.
A single text character (8 bits) is one byte of data.
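You can see the bit-to-byte relationship for yourself with a couple of lines of Python; the character “A” here is just an arbitrary example:

```python
# One byte = 8 bits; an ASCII text character fits in one byte.
ch = "A"
code = ord(ch)              # the character's numeric value
bits = format(code, "08b")  # that same value written as 8 binary digits

print(ch, "=", code, "=", bits)  # each digit is one bit
assert len(bits) == 8            # eight bits to the byte
```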
A short novel holds about 1MB (megabyte or million bytes).
The complete works of Shakespeare equal about 5MB (megabytes).
A collection of the works of Beethoven holds about 20GB (gigabytes, or billion bytes).
One TB (terabyte, or trillion bytes) of data is roughly equal to all the printed paper made from 50,000 trees.
The world’s largest active archive of weather data, the National Climatic Data Center database, holds 400 terabytes of data.
Hold on, there’s more . . .
All printed material in the world is estimated to contain 200PB (petabytes): 200 quadrillion bytes, or 200 thousand terabytes, of data.
All the words ever spoken by human beings are estimated to total 5EB (exabytes). An exabyte is approximately one quintillion bytes; “exa” means one billion billion, or one quintillion.
Measured in binary terms, an exabyte breaks down to 1,152,921,504,606,846,976 bytes of computer data!
The next measurement of computer storage would be the zettabyte, or ZB, which is equal to one sextillion bytes. If you were going to write out the number, it would look like this: 1,000,000,000,000,000,000,000 bytes.
One yottabyte, or 1YB, is equivalent to 1,024 ZB, or over a septillion bytes: a “1” followed by 24 zeros. If you had a yottabyte of disk storage, it would be the equivalent of more than eight trillion 120 GB hard disks.
There are two more: the brontobyte (octillion bytes), which written out would be a “1” followed by 27 zeros, and the gazzabyte (nonillion bytes), a “1” followed by 30 zeros.
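The whole ladder of units above can be laid out in a few lines of Python. This is my own summary of the decimal (powers-of-10) values; as noted, the binary (powers-of-2) figures come out slightly larger, which is where that long exabyte number comes from:

```python
# Decimal storage units mentioned above, as powers of 10.
# ("Brontobyte" and "gazzabyte" are informal names, not standard units.)
units = {
    "kilobyte": 10**3,   "megabyte": 10**6,  "gigabyte": 10**9,
    "terabyte": 10**12,  "petabyte": 10**15, "exabyte": 10**18,
    "zettabyte": 10**21, "yottabyte": 10**24,
}

for name, size in units.items():
    print(f"1 {name:<9} = {size:,} bytes")

# In binary terms (powers of 2), the same units run slightly larger:
assert 2**60 == 1_152_921_504_606_846_976  # the binary exabyte figure
assert 1024 * 2**70 == 2**80               # 1,024 binary ZB = 1 binary YB
```

The assertions at the bottom confirm the arithmetic behind the 19-digit exabyte figure and the 1,024-to-1 zettabyte/yottabyte relationship quoted above.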
Researchers at the University of California at Berkeley have continued to estimate, each year, how much information exists on our planet.
For those of you out there who love numbers and want to see more of these “Data Powers of 10” I have been talking about, browse over to this web link at the University of California at Berkeley: http://www2.sims.berkeley.edu/research/projects/how-much-info/datapowers.html and bring your calculator!