5 Futuristic Trends in Supercomputing


As our devices become more Web-ready and we move away from the personal computing paradigm, most of us still imagine "supercomputers" as those great, hulking Cray and IBM machines the Baby Boomers dreamed about, with a million blinking lights and cranks and levers. But all over the world, giant parallel systems -- sometimes looking a bit more like those old supercomputers than you might think! -- are still being developed.

Most of us are familiar with Moore's Law, which says at its most basic that the number of transistors on a computer chip -- and with it, the chip's power -- will double roughly every 18 to 24 months. It's easy to forget that this doesn't just apply to our laptops and desktops. All of our electronic devices benefit from the same improvement cycle: processing speed, sensor sensitivity, memory and even the pixels in your camera or phone. But transistors can only get so small before quantum-mechanical effects kick in, and some experts say this trend -- which has held surprisingly true over the last 50 years -- may slow down over the next decade as we approach the real limits of what our current materials can do [source: Peckham].
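To get a feel for the numbers involved, the doubling described above can be sketched in a few lines of Python. The starting count and the two-year doubling period here are illustrative assumptions, not figures from the text:

```python
def moores_law(initial_count, years, doubling_period_years=2.0):
    """Project a transistor count forward, assuming one doubling
    every `doubling_period_years` (an illustrative assumption)."""
    doublings = years / doubling_period_years
    return initial_count * 2 ** doublings

# With a 2-year doubling period, 50 years means 25 doublings:
growth = moores_law(1, 50)
print(f"Growth over 50 years: {growth:.0f}x")  # 2**25, over 33 million-fold
```

Even at the slower 24-month pace, fifty years of doubling multiplies a chip's transistor budget more than 33 million times over, which is why the same trend that shrank room-sized machines now shows up in phones and cameras.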

Our phones and tablets may be the result of shrinking awesome amounts of computing power down to something you can take to the beach, but we're only seeing the face of all that data. Behind the scenes, the "cloud" demands more data and faster computation than ever before, growing just as steadily as the quality and quantity of content we enjoy on our side of the screen. From the high-def movies we stream to the weather, traffic and other satellite info we already use throughout our days, the future lies in plain old number-crunching. And that's what supercomputers do best.