
Supercomputer

The DIY PC in a LEGO case, by Mike Schropp (Photo: Total Geekdom)

Most custom-built DIY PCs with unusual case mods are made just for fun or fashion. The LEGO-bodied PC by Mike Schropp is quite different, however, impressive looks aside: it's a 12-core setup consisting of three systems in a single LEGO case, with its computing power donated to medical research and humanitarian projects via IBM's World Community Grid project.

The IBM Blue Gene/P ('Intrepid') supercomputer (Photo: Argonne National Laboratory)

There are plenty of scientific research projects out there that could produce some interesting results, if only they had access to a supercomputer. With that in mind, this week the US Department of Energy (DoE) announced that it has awarded 57 deserving projects a total of almost 1.7 billion processor hours on two of its (and the world’s) most powerful computers. It’s part of the DoE’s cleverly-acronymed Innovative and Novel Computational Impact on Theory and Experiment (INCITE) program, the aim of which is primarily “to further renewable energy solutions and understanding of the environmental impacts of energy use.” That said, the program is open to all scientists in need of heavy-duty data crunching.
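For a rough sense of scale, here's a quick back-of-the-envelope sketch in Python using only the figures quoted above (the even split across projects is purely an illustrative assumption – real awards vary widely):

    # Rough arithmetic on the INCITE figures quoted above.
    # Assumption: an even split across projects, purely for illustration.
    total_hours = 1.7e9                  # ~1.7 billion processor hours awarded
    projects = 57                        # number of funded projects
    hours_per_project = total_hours / projects               # ~29.8 million
    core_years_per_project = hours_per_project / (24 * 365)  # one core running flat out for a year

    print(f"~{hours_per_project / 1e6:.0f} million processor hours per project")
    print(f"roughly {core_years_per_project:,.0f} cores running continuously for a year, per project")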

Fujitsu's new supercomputer is nicknamed the 'K', a reference to the Japanese word 'Kei,' ...

It wasn't so long ago that we reported on the Roadrunner supercomputer breaking the petaflop barrier. But this week Fujitsu announced that it will begin shipping its next-generation supercomputer, which has a lofty performance goal of 10 petaflops – that's ten thousand trillion operations per second! The computer is nicknamed the 'K', a reference to the Japanese word "Kei," or 10 to the 16th power. If the K reaches this goal, it will claim first place – at least for a while – on the Top 500 supercomputers list.
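The nickname and the number line up neatly. A quick sanity check of the arithmetic, using nothing but standard SI prefixes:

    # 1 petaflop/s = 1e15 floating-point operations per second (SI prefix 'peta').
    # The Japanese numeral 'kei' denotes 10**16.
    target_petaflops = 10
    ops_per_second = target_petaflops * 1e15   # = 1e16, i.e. ten thousand trillion
    kei = 10 ** 16

    print(ops_per_second == kei)                          # True: 10 PFLOPS is exactly one 'kei' per second
    print(f"{ops_per_second:.0e} operations per second")  # 1e+16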

NeuFlow takes its inspiration from the mammalian visual system, mimicking its neural netwo...

The brain’s ability to visually interpret our environment requires such an enormous number of computations that it is remarkable the feat is accomplished so quickly and with seemingly little effort. Building a computer-driven system that can mimic the human brain in visually recognizing objects has proven difficult, but now Eugenio Culurciello of Yale’s School of Engineering & Applied Sciences has developed a supercomputer modeled on the human visual system that performs the task more quickly and efficiently than ever before.
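The NeuFlow hardware itself is described in the full article; purely as an illustration of the kind of operation such bio-inspired vision systems spend their time on, here's a minimal 2D convolution – the basic building block of convolutional, neural-network-style image processing – in plain Python. It is not Culurciello's design or code.

    # Illustrative only: a single 2D convolution, the kind of arithmetic vision
    # accelerators like this run in hardware. Not NeuFlow's architecture or code.
    import numpy as np

    def convolve2d(image, kernel):
        kh, kw = kernel.shape
        h, w = image.shape
        out = np.zeros((h - kh + 1, w - kw + 1))
        for y in range(out.shape[0]):
            for x in range(out.shape[1]):
                out[y, x] = np.sum(image[y:y + kh, x:x + kw] * kernel)
        return out

    edge_kernel = np.array([[-1, 0, 1]] * 3, dtype=float)  # crude vertical-edge detector
    image = np.random.rand(64, 64)                         # stand-in for a camera frame
    features = convolve2d(image, edge_kernel)
    print(features.shape)                                  # (62, 62) feature map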

Prof. Ralph Eichler, President, ETH Zurich and Dr. John Kelly, Senior Vice President IBM R...

IBM has announced that its first-of-a-kind hot water-cooled supercomputer has been installed at the Swiss Federal Institute of Technology Zurich (ETH Zurich). Named Aquasar, the system not only consumes up to 40 per cent less energy than an air-cooled machine, but its direct use of waste heat in the building's heating system also translates to an 85 per cent cut in carbon dioxide emissions.

Two of the microchannel hot water heat sinks, on a server blade from IBM/ETH's Aquasar sup...

It’s easy to think of the Internet as something that’s just “out there” in cyberspace and doesn’t affect the physical world in any tangible way. In 2009, however, it was estimated that Internet data centers worldwide consumed about 2% of global electricity production. Not only did most of that electricity undoubtedly come from non-green sources, but it also cost the global economy approximately US$30 billion. Much of that electricity was needed to power the data centers’ forced-air cooling systems that keep the servers from overheating. Now, researchers from IBM Zurich and the Swiss Federal Institute of Technology Zurich (ETH) have devised a much more efficient method for cooling the steamy Internet - they use hot water.
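To get a feel for what that 2% means, here's a rough sketch. The worldwide generation figure is an outside assumption (very roughly 20,000 TWh for 2009); everything else comes from the estimates quoted above:

    # Back-of-the-envelope scale of the data-center figures quoted above.
    # ASSUMPTION: world electricity generation in 2009 of roughly 20,000 TWh (approximate).
    world_generation_twh = 20_000
    datacenter_share = 0.02                                       # the ~2% estimate above
    datacenter_twh = world_generation_twh * datacenter_share      # ~400 TWh

    cost_usd = 30e9                                               # the ~US$30 billion estimate above
    implied_price_per_kwh = cost_usd / (datacenter_twh * 1e9)     # 1 TWh = 1e9 kWh

    print(f"~{datacenter_twh:.0f} TWh drawn by data centers")
    print(f"implied average cost of ~${implied_price_per_kwh:.3f} per kWh")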

Our current “Standard Model” of cosmology (left), a model without dark energy, and a warm ...

Scientists have for some time postulated that "dark matter" could partially account for evidence of missing mass in the universe. Meanwhile "dark energy," a hypothetical form of energy, is the most popular explanation for recent observations that the universe appears to be expanding at an accelerating rate, and it accounts for 74 percent of the total mass-energy of the universe according to the standard model of cosmology. To better understand these two mysterious cosmic constituents, scientists at the Los Alamos National Laboratory (LANL) are using Roadrunner, the world’s fastest supercomputer, to run one of the largest simulations ever made of the distribution of matter in the universe.
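The Roadrunner runs themselves use specialized, massively parallel cosmology codes and billions of particles, so the following is only a toy sketch of the underlying idea – particles evolving under their mutual gravity until structure emerges:

    # Toy illustration of a gravitational N-body simulation. LANL's actual runs are
    # vastly larger and use far more sophisticated codes; this is just the basic idea.
    import numpy as np

    G, dt, softening = 1.0, 0.01, 0.05          # toy units
    n = 200
    pos = np.random.rand(n, 3)
    vel = np.zeros((n, 3))
    mass = np.full(n, 1.0 / n)

    def accelerations(pos, mass):
        diff = pos[None, :, :] - pos[:, None, :]                  # pairwise separations
        dist3 = (np.sum(diff ** 2, axis=-1) + softening ** 2) ** 1.5
        return G * np.sum(diff * (mass[None, :, None] / dist3[:, :, None]), axis=1)

    for step in range(100):                     # simple kick-drift update
        vel += accelerations(pos, mass) * dt
        pos += vel * dt

    print(pos.shape)                            # (200, 3) particle positions after 100 steps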

Asus takes a break from 'cheap and cheerful' to produce a 1.1 teraflop desktop-sized compu...

Say goodbye to the days when supercomputers had to fill a room, and welcome the first ever supercomputer from Asus, purveyor of all things Eee – the ESC 1000. Produced in conjunction with NVIDIA and National Chiao Tung University in Taiwan, the desktop-sized machine is capable of speeds of up to a mighty 1.1 teraflops, which may pale in comparison to the petaflop Roadrunner, but then so does its footprint.
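To put that in perspective with a line of arithmetic (using only the figures mentioned here):

    # How far does 1.1 teraflops "pale" next to a petaflop-class machine? Rough ratio only.
    esc1000_flops = 1.1e12       # 1.1 teraflops, as quoted above
    roadrunner_flops = 1.0e15    # ~1 petaflop, the barrier Roadrunner broke
    print(f"Roadrunner-class speed is roughly {roadrunner_flops / esc1000_flops:.0f}x the ESC 1000's")  # ~909x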

Visualization of an astrophysics simulation to discover the mechanism behind the violent d...

Complex visualizations, such as the Dali-esque rendering of a supernova above, don’t just produce pretty pictures ideal for desktop wallpapers. They also allow scientists to see simulations of complex physical, chemical and biological phenomena. Unfortunately, generating the quadrillions of data points required for visualizations of everything from supernovas to protein structures is quickly overwhelming current computing capabilities. So scientists at the U.S. Department of Energy’s (DOE) Argonne National Laboratory are exploring ways to speed up the process using a technique called software-based parallel volume rendering.
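Argonne's contribution is the "parallel" part – spreading the rendering work across thousands of processors – but the core idea of volume rendering itself is simple enough to sketch. The following serial toy version (not Argonne's code) just marches along each viewing ray through a 3D data set and accumulates opacity:

    # Minimal serial sketch of volume rendering: step along each viewing ray through a
    # 3D scalar field and composite opacity front to back. The parallel technique
    # distributes this kind of work across many processors.
    import numpy as np

    volume = np.random.rand(64, 64, 64)            # stand-in for simulation data (e.g. density)

    def render_orthographic(volume, opacity_scale=0.05):
        nx, ny, nz = volume.shape
        image = np.zeros((nx, ny))
        for i in range(nx):
            for j in range(ny):
                transmittance, accumulated = 1.0, 0.0
                for k in range(nz):                # march along the ray (the z axis here)
                    alpha = volume[i, j, k] * opacity_scale
                    accumulated += transmittance * alpha * volume[i, j, k]
                    transmittance *= (1.0 - alpha)
                image[i, j] = accumulated
        return image

    print(render_orthographic(volume).shape)       # (64, 64) rendered image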

The Novo-G reconfigurable supercomputer (Photo: University of Florida)

Today's computers can carry out a wide range of tasks thanks to a general architecture that allows for great flexibility at the cost of non-optimal performance; at the other end of the spectrum, application-specific integrated circuits (ASICs) perform a single, very specific task with great speed and energy efficiency, but are very inflexible. Now Novo-G, a reconfigurable supercomputer developed at the University of Florida that's described as the most powerful of its kind, is attempting to take the best of both worlds by effectively changing its hardware configuration as needed to compute with the greatest possible speed and efficiency.
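There's no tidy way to show hardware reconfiguration in a few lines of code, so the sketch below is only a software analogy of the tradeoff described above: a general-purpose evaluator that can handle any formula at run time, versus a path fixed in advance for one specific formula. Novo-G does the real thing in reconfigurable hardware, not Python functions.

    # Software analogy only -- it illustrates the flexibility-versus-speed tradeoff,
    # not Novo-G's actual reconfigurable hardware.
    import timeit

    def interpret(expr, x):
        # General path: walk a formula description at run time (flexible, slower).
        op = expr[0]
        if op == "num": return expr[1]
        if op == "var": return x
        if op == "add": return interpret(expr[1], x) + interpret(expr[2], x)
        if op == "mul": return interpret(expr[1], x) * interpret(expr[2], x)
        raise ValueError(op)

    formula = ("add", ("mul", ("num", 3.0), ("var",)), ("num", 7.0))   # 3*x + 7

    def specialized(x):
        # Specialized path: the same formula fixed in advance (inflexible, faster).
        return 3.0 * x + 7.0

    print(timeit.timeit(lambda: interpret(formula, 2.5), number=100_000))
    print(timeit.timeit(lambda: specialized(2.5), number=100_000))      # noticeably quicker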
