5 Futuristic Trends in Supercomputing


As our devices become more Web-ready and we move away from the personal computing paradigm, most of us still picture "supercomputers" as those great big Cray and IBM machines the Baby Boomers dreamed about, with a million blinking lights, cranks and levers. But all over the world, giant parallel systems -- some of them looking a bit more like those old supercomputers than you might think -- are still being developed.

Most of us are familiar with Moore's Law, which says at its most basic that computer chips will double in power every 18 to 24 months. It's easy to forget that this doesn't just apply to our laptops and desktops. All of our electronic devices benefit from the same improvement cycle: processing speed, sensor sensitivity, memory and even the pixels in your camera or phone. But chips can only get so small and powerful before quantum-mechanical effects kick in, and some experts say this trend -- which has held surprisingly true over the last 50 years -- may slow down over the next decade as we approach the real limits of what our current materials can do [source: Peckham].
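
To see what that doubling rate adds up to, here's a rough back-of-the-envelope sketch in Python. It assumes nothing beyond the 18-to-24-month doubling times and the roughly 50-year run mentioned above; the function name is just for illustration.

```python
# A rough sketch of Moore's Law as described above: performance doubling
# every 18 to 24 months, compounded over roughly 50 years.
# The numbers are illustrative, not measurements.

def total_speedup(years, months_per_doubling):
    doublings = years * 12 / months_per_doubling
    return 2 ** doublings

for months in (18, 24):
    print(f"Doubling every {months} months for 50 years: "
          f"about {total_speedup(50, months):.1e}x faster")
```

Even at the slower 24-month pace, that works out to a factor of more than 30 million over 50 years.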


Our phones and tablets may be the result of shrinking awesome amounts of computing power down to something you can take to the beach, but we're only seeing the face of all that data. Behind the scenes, the "cloud" requires more and faster information and computation than ever before, rising just as steadily as the quality and amount of data we're enjoying on our side of the screen. From the high-def movies we stream, to the weather, traffic and other satellite info we already use throughout our days, the future lies in plain old number-crunching. And that's what supercomputers do best.

5: Exaflops and Beyond!

Miniaturization of chip components is only half the story. On the other side of the scale, you have the supercomputer: custom setups, built for power. In 2008, the IBM Roadrunner broke the one-petaflop barrier: one quadrillion operations per second [source: IBM]. (FLOPS stands for "floating-point operations per second," and it's the standard measure for supercomputers built for scientific calculations, like the ones in this article.)

Expressed in scientific notation, a petaflop is 10^15 operations per second. An exaflop computer -- which experts predict will arrive by 2019 -- performs at 10^18, a thousand times faster than the petaflop computers we're seeing now [source: HTNT]. For comparison's sake, as of June 2011, the 500 fastest supercomputers in the world combined still had less than 60 petaflops of power. Continuing into the future, zettaflop machines improve on the same scale, giving us 10^21 operations per second by 2030, and then come the yottaflops, at 10^24 [source: TOP500].
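
To put those prefixes side by side, here's a quick, purely illustrative sketch in Python. The only inputs are the figures from the paragraph above (10^15 through 10^24, and the roughly 60 combined petaflops of the June 2011 TOP500 list); the dictionary and variable names are made up for the example.

```python
# Rough FLOPS arithmetic using the figures above; purely illustrative.

PREFIXES = {
    "petaflop": 10**15,   # one quadrillion operations per second
    "exaflop": 10**18,    # 1,000 times a petaflop
    "zettaflop": 10**21,  # 1,000 times an exaflop
    "yottaflop": 10**24,  # 1,000 times a zettaflop
}

# June 2011: the 500 fastest supercomputers combined came in under 60 petaflops.
top500_combined_2011 = 60 * PREFIXES["petaflop"]

# A single exaflop machine would out-run that entire combined list...
print(PREFIXES["exaflop"] / top500_combined_2011)            # about 16.7x

# ...and a zettaflop machine would do in one second what the whole
# 2011 list would need roughly 4.6 hours to finish.
print(PREFIXES["zettaflop"] / top500_combined_2011 / 3600)   # about 4.6 (hours)
```

In other words, a single zettaflop machine would pack more than 16,000 times the combined punch of 2011's entire TOP500 list.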


But what do those numbers really mean? Well, for starters, it's believed that a complete simulation of the human brain will be possible by 2025, and within another 10 years or so, zettaflop computers should be able to accurately predict the entire weather system two weeks in advance.

4: Green Supercomputers

CPU cooling fans like this one are a common sight -- supercomputers actually expend more than half of their energy just managing their temperature. Supercomputer engineers are always on the quest for better, more efficient ways to keep things running cool.
©iStockphoto/Thinkstock

All that power comes at a cost. If you've ever had a failed heat sink crash your desktop computer, or sat with a laptop actually in your lap for more than a few minutes, you know what that cost is: Computers use a lot of heat-producing energy. In fact, one of the major challenges for supercomputer developers is finding a sensible way to install and use the mighty machines without hardware failures or increasing damage to the earth. After all, one of the main uses of weather simulations will be to monitor carbon and temperature fluctuations, so it wouldn't be very smart to add to the very problem climatologists are trying to solve!

Any computer system is only as useful as its worst day, so keeping those hot circuits cool is of major interest. In fact, more than half of the energy used by supercomputers goes directly to cooling. And because the future of supercomputing is tied up with other vanguard sciences and futurist agendas, ecological concerns are already a matter of grave importance to high-performance computer engineers [source: Jana]. Green solutions and energy efficiency are part of every supercomputer project underway; in fact, energy efficiency was a large secondary reason IBM's Sequoia supercomputer debuted to such fanfare.


From cooling with "free air" -- that is, engineering ways to bring outside air into the system -- to hardware designs that maximize surface area, scientists try to be as innovative with efficiency as they are with speed. One of the most interesting ideas several teams are pursuing is to run the system in a heat-conducting (but electrically insulating) liquid that picks up the heat and then gets piped through the building housing the computer banks themselves. Heating water and rooms while simultaneously cooling the equipment is an idea with applications well beyond supercomputer sites, and the projects on TOP500's list are taking ideas like these very seriously indeed.

Addressing ecological and efficiency concerns isn't just a good idea for our planet; it's necessary to make the machines run at all. It may not be the most glamorous item on a list of futuristic trends, but it's the one that makes the rest of them possible.

3: The Artificial Brain

Let's talk about what happens between 2025 and 2030, when supercomputers are expected to be able to map the human brain [source: Shuey]. A scientist at Syracuse University estimated in 1996 that our brains have a memory capacity somewhere between one and 10 terabytes, probably around three [source: MOAH]. Of course, that wasn't a hugely useful comparison, since our brains don't work the same way computers do. But within the next 20 years, computers should be able to work the way our brains do!

In the same way supercomputers are already helping map the human genome -- suggesting treatments and predicting inherited medical issues and predispositions -- accurate models of the human brain would mean huge leaps in diagnosis, treatment and our understanding of the complexities of human thought and emotion. In conjunction with imaging technology, doctors could pinpoint trouble areas, simulate different forms of treatment and even get to the root of questions that have plagued us from the beginning of time. Implantable and graftable chips and other technology could help monitor and even shift levels of serotonin and other brain chemicals related to mood disorders, while major brain malfunctions and injuries might even become reversible.


Beyond the medical advances this technology will help us reach, there's also the little matter of artificial intelligence (AI). Mid-range computing power already gives us some impressive AI -- think of the intelligent systems recommending customized selections of television, movies and books using AI algorithms, or the hours we spend chatting with Siri and similar virtual assistants -- but a human-like level of "mental" complexity opens the door to true AI. Imagine a WebMD that actually responds like a doctor, bringing expert-level attention to the questions you ask. Then expand that concept beyond medical concerns to virtual experts explaining anything you need to know in a comfortable, conversational environment.

2: Weather Systems and Complex Models

As supercomputers become progressively more advanced, their ability to create predictive models of the weather will also ratchet up. Instead of waiting for weather events like this storm to appear and then analyzing the data related to them, we'll be able to forecast them well in advance.
©Stockbyte/Thinkstock

By 2030, it's hoped, zettaflop supercomputers should be able to accurately model the entire Earth's weather systems at least two weeks out [source: Thorpe]. We're talking about 99 to 100 percent accurate simulations of our entire planet and ecosphere, with local and global predictions available at the press of a button. Disasters aside, that may seem like a relatively specialized application -- beyond scheduling a wedding or vacation, when was the last time you needed to know the weather forecast weeks or months ahead? -- but consider the scale of information here.

Earth's climate is such a complex system that it's regularly invoked in discussions of chaos theory, about the greatest amount of complexity most of us can even comprehend. "Does the flap of a butterfly's wings in Brazil set off a tornado in Texas?" was the classic question posed by Philip Merilees in 1972 as the title of Edward Lorenz's famous talk (though the idea of meteorological complexity in this sense goes all the way back to Poincaré in 1890) [source: Wolchover]. When you think about it like that, it's hard to imagine a more complex system on a grander scale than our planet's weather.


Food production and farming, weather's effects on other large-scale scientific projects (shuttle launches or polar expeditions, for example) and preemptive disaster relief are just some of the significant -- and lifesaving -- changes this kind of computing power could give us.

And of course, weather systems are just the tip of the icecap, here: If you can imagine perfectly simulated weather patterns, it's just a jump to proper modeling of any similarly huge and complex system. Whole planets and worlds could be realized in the silicon and electricity of tomorrow's supercomputers.

1: Simulated Worlds

Most of us are familiar with multiplayer gaming environments online, and can remember when artificial environments like "Second Life" were all the rage. Virtual realities have been a science-fiction hot button for at least a century. But when you put these ideas in the hopper with the capabilities of supercomputing right around the corner, gaming and role-playing environments become useful for a lot more than entertainment.

While there will no doubt be awe-inspiring development along the lines of "Second Life" and "The Matrix" -- and the cultural and societal changes that come with it -- the concept of data overlay on our daily lives is an even more exciting and useful application of all this power. Guided historical tours, dynamic GPS directions and online restaurant reviews already show this technology in its infancy. Now imagine speeding up those weather-pattern simulations and adding chaotic elements like human minds and behavior: you could test theories of civil engineering, city planning and even food and resource inequality.


Supercomputers won't have to guess at that information, although they'd be great at it: They can take in data from every possible source -- from the latest trending tweets to traffic patterns to energy-grid usage -- and build real-time models that not only manage what's happening now, but plan for the future. Rolling blackouts, gas shortages, even the gridlock around high-population events like the Olympics could become things of the past.

With pervasive wireless Internet poised to spread across the country and the world, providing the generations of connectivity beyond 4G, the truly powerful simulated world will one day soon be no different from the world we already live in, only better. It will be more informed, more personalized and, above all, better at empowering us as individuals and as a civilization to put that information to its best use. And all of this information and dynamic possibility will be brought to you by the performance power of the supercomputers we're only now bringing to life.

Lots More Information

Author's Note: 5 Futuristic Trends in Supercomputing

As a dedicated user of the cloud, I prefer to travel light. I personally only use about a gigabyte of my computer's hard drive, so that I can access my media and work projects from whichever device is most convenient. But in learning about how supercomputers are shifting the focus of computing power even further away from my phone and tablet, I was astonished to see just how many truly futuristic opportunities for the betterment of our lives are just within our grasp.


More Great Links

  • Daily Mail. "Get Ready For The Supercomputer That Can Predict the Future." Daily Mail. December 2011. (Aug 8, 2012) http://www.dailymail.co.uk/sciencetech/article-2069775/Get-ready-supercomputer-predict-future-EU-prepares-900m-funding.html
  • Graham, Susan L., Snir, Marc, and Patterson, Cynthia A. "Getting Up To Speed: The Future Of Supercomputing." National Research Council of the National Academies Committee On The Future Of Supercomputing, National Academies Press. 2005. (Aug 8, 2012) http://research.microsoft.com/en-us/um/people/blampson/72-cstb-supercomputing/72-cstb-supercomputing.pdf
  • IBM. "Roadrunner." White paper. June 2008. (Aug 8, 2012) http://www.ibm.com/ibm/ideasfromibm/us/roadrunner/20080609/index.shtml
  • Jana, Reena. "Green IT: Corporate Strategies." Business Week. February 2011. (Aug 8, 2012) http://www.businessweek.com/stories/2008-02-11/green-it-corporate-strategiesbusinessweek-business-news-stock-market-and-financial-advice
  • McMillan, Robert. "Intel Sees Exabucks in Supercomputing's Future." Wired. January 2012. (Aug 8, 2012) http://www.wired.com/wiredenterprise/2012/01/supercomputings-future
  • Museum Of American Heritage. "Technology." Last updated May 2010. (Aug 8, 2012) http://www.moah.org/exhibits/archives/brains/technology.html
  • Peckham, Matt. "The Collapse of Moore's Law: Physicist Says It's Already Happening." Time. May 2012. (Aug 8, 2012) http://techland.time.com/2012/05/01/the-collapse-of-moores-law-physicist-says-its-already-happening
  • Perry, Douglas. "You'd Need a 1 Petaflop to Score #20 Rank in Top 500 List." Tom's Hardware Technology Blog. July 2012. (Aug 8, 2012) http://www.tomshardware.com/news/supercomputer-top500-petaflop-servers-DOE,16042.html
  • Yonck, Richard. "The Supercomputer Race, Revisited." World Future Society Blogs. June 2011. (Aug 8, 2012) http://www.wfs.org/content/supercomputer-race-revisited
