CBR’s Gary Flood reports on the University of Southampton’s new supercomputer, claimed to be the fastest machine running Windows in Europe
What do you say to the man who's just crammed one of the world's fastest supercomputers into his 1970s-built basement data centre?
Answers on the figurative postcard should be dispatched to Oz Parchment, IT infrastructure services manager at the University of Southampton, who has helped oversee a £3m investment to get 'Iridis III' – officially ranked 74th in the Top500 list of the world's fastest high performance computing (HPC) systems – up and running.
The 8,000-core system is in any case the fastest supercomputer at an English university, capable of over 74 trillion calculations per second, as well as being claimed as one of the greenest, thanks to its low power consumption and water-cooled heat-handling environment. Parchment also contends his new baby is the fastest machine running the Windows operating system in the whole of Europe.
This isn't the first supercomputer at the university, or even in Parchment's basement. The clue is in the name: the university ran two previous incarnations of Iridis (something of a neologism, mixing the Latin for 'rainbow' with a nod to the Greek goddess Iris, whose cloak shimmered with coloured light) for a number of years, with the earlier systems based on either old-school mainframe (IBM 3090) or mini-era (SGI) clustering technology.
This version, which took two years from first specification to going live, is based on the IBM iDataPlex server. It gives academic users at the facility 20-30 times more power than the older iterations – computing power equivalent to 4,000 standard office PCs running at once. The beast also uses 100TB of storage (on the IBM DS4700 product). Running Windows HPC (High Performance Computing) Server 2008 R2 software, the supercomputer delivers in excess of 299.52 megaflops of performance for every watt of power it consumes. There is also a large tape library, cloned and held off-site for security, because Parchment sees his job as protecting “the data, not the system”.
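As a rough sanity check, the quoted efficiency implies a total power draw in the low hundreds of kilowatts – a back-of-envelope sketch, assuming the 74-teraflop headline speed and the 299.52 megaflops-per-watt figure refer to the same benchmark run (the article does not say):

```python
# Back-of-envelope check of the quoted performance and efficiency figures.
# Assumption: both numbers come from the same (Linpack-style) benchmark run.
peak_flops = 74e12        # ~74 trillion calculations per second
efficiency = 299.52e6     # 299.52 megaflops per watt

power_watts = peak_flops / efficiency
print(f"Implied power draw: {power_watts / 1e3:.0f} kW")  # ~247 kW
```

That implied draw of roughly 247kW sits comfortably within the cooling capacity Parchment describes later in the piece.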
Southampton has no shortage of users confident they can make use of such power and memory: designing experimental planes with engines over their wings instead of under them (making them massively less noisy); digitally preserving the now-crumbling 19th century wax cylinders on which sound was first recorded; simulating the behaviour of beta blocker drugs; and many others.
On the plane front, for instance, a supercomputer is useful because designing an airframe demands such horrendous calculations that designers tend to stick to one basic design – which is why all airliners look the same. Even with Iridis III, only modest variations on that design are thought to be feasible.
Acid test time: is the system powerful enough for such loads? Parchment says that so far the load has never gone above 66% of the system's capacity. “I think we have about 20 to 30 times available what the researchers have currently called for,” he says. “We have no need I don't think to even consider extending the system for another 18 months.”
It also makes sense that Iridis is based where it is. Southampton is one of the UK’s leading teaching and research institutions, internationally recognised for its leading-edge research and scholarship across a wide range of subjects from engineering and science, to health, humanities and the social sciences. With its 22,000 students, 5,000 staff and equivalent annual turnover of over £370m, it is a serious organisation in its own right.
Southampton in fact says it sees the investment as the cost of doing business. “For a place like this, HPC is basic infrastructure. We're a research organisation,” says Parchment. “This is also key to the university's long-term strategy, and the sign-off went all the way to the top – this is very much supported at the highest level.”
In terms of the kit, the IBM modules – based on a 'half-depth' form factor that reduces the airflow required across the components, lowering the power needed for cooling while fitting twice the number of servers into the same space as a standard 42U rack – were a sensible option given the practical problems Parchment had to solve to create his new Iridis.
The Southampton deployment is one of the first UK installations of the top-end IBM iDataPlex. Released in mid-2008, the system is based on Intel Xeon (5500/5600) processors and packaged in a way Big Blue says offers unprecedented compute density, power and cooling efficiencies.
A key selling point is that the chassis allows blade-style and rack-style servers to co-exist: all of the nodes inside the system ship in a 2U chassis with dedicated power supply and fans. A June 2009 report by the Standard Performance Evaluation Corporation rated the dx360 M2 version as the top server for both performance and energy efficiency in the entire x86 marketplace.
The cooling claim depends on the firm's Rear Door Heat eXchanger liquid cooling module being installed in the iDataPlex racks (as it is at Southampton), meaning typical outlet temperatures at the back of the system are up to 10 degrees lower than the inlet (ambient air) temperature. IBM is aiming the system at the academic and research supercomputer market but also at compute-intensive commercial applications, such as cloud computing service providers.
Power and cooling are familiar data centre management problems. “One of our key challenges was the physical limitations of the existing, 30-year-old data centre,” says Parchment. “Any new installation just had to share the regular data centre facilities and space, and we couldn't replace the entire air conditioning system. Whatever we came up with had to be able to work within those constraints, especially on cooling.”
The iDataPlex has a minimal heat footprint in the data centre, which lets Southampton avoid building a massive cooling infrastructure around the HPC system. Specifically, Parchment told CBR that what appealed to him was the built-in heat exchanger in the iDataPlex's rear door, which uses water to cool the expelled heat before it enters the room – not only more environmentally friendly than standard air conditioning, but also the means of physically fitting enough computing power into the space available “without having the staff work in bikinis – and you wouldn't want that image if you'd seen our staff.”
The iDataPlex is built to reduce the number of fans and power supplies needed to cool and power the supercomputer's components, and its built-in water cooling system is hailed as very energy efficient. Right now the site has 400kW of cooling available, fed by two big external chillers, but seems to be running fine on 220-230kW, he says.
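On those figures the cooling plant has generous headroom – a quick sketch, taking the midpoint of the 220-230kW range quoted:

```python
# Spare cooling capacity implied by the figures quoted in the article.
installed_kw = 400   # cooling available on site
in_use_kw = 225      # midpoint of the 220-230kW range actually in use

headroom = (installed_kw - in_use_kw) / installed_kw
print(f"Spare cooling capacity: {headroom:.0%}")  # ~44%
```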
“Traditionally we have always banked on 30% of capital expenditure being put towards data centre cooling and other enhancements, but in practice it's turning out less than 10%,” he says.
All in all, the installation looks like a bit of good news for one higher education institution in a sector bracing itself for inevitable cuts in funding.
More fascinating insights into the installation can be found on the University of Southampton’s website at http://tinyurl.com/354mlzv.
Scores on the doors
The specifics of the Iridis III/IBM iDataPlex installation at Southampton, designed and built by HPC specialist VAR OCF, are as follows:
• 1008 x IBM iDataPlex server nodes with Intel 2.26GHz processors
• 13 x IBM iDataPlex server racks with rear-door heat exchangers
• An InfiniBand switch network (32 x QLogic 36-port Leaf Switches and 4 x QLogic 36-port Core Switches) interconnects to all server nodes, combined with an Ethernet management network (26 x BLADE Network Technologies 48-port rack switches)
• 3 x IBM x3650m2 management/login server nodes
• 2 x IBM x3550 scheduling server nodes
• 10 x IBM x3550 service server nodes
• 2 x IBM x3550m2 Windows master server nodes
• 4 x IBM x3550 storage nodes
• 2 x IBM DS4700 storage controllers
• 8 x IBM EXP810 storage shelves
• 160 x IBM 1000GB SATA EV-DDM hard drives
• Storage is managed by IBM’s General Parallel File System while system management is provided by Cluster Resources’ Adaptive HPC Suite to allow users to switch between a Linux or a Windows workload.
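The spec sheet squares roughly with the headline numbers – a back-of-envelope sketch, assuming dual-socket quad-core nodes (eight cores apiece) and four double-precision flops per core per cycle, typical for Xeon 5500-series parts of the era, though neither figure is stated in the article:

```python
# Rough peak-performance estimate from the spec list above.
# Assumptions (not in the article): 8 cores per node,
# 4 double-precision flops per core per cycle.
nodes = 1008             # iDataPlex server nodes
cores_per_node = 8       # assumed dual-socket quad-core
clock_hz = 2.26e9        # 2.26GHz Intel processors
flops_per_cycle = 4      # assumed for Nehalem-era Xeons

total_cores = nodes * cores_per_node
peak_tflops = total_cores * clock_hz * flops_per_cycle / 1e12
print(f"{total_cores} cores, ~{peak_tflops:.0f} Tflops peak")
```

Under those assumptions the maths gives 8,064 cores and roughly 73 theoretical peak teraflops – in the same ballpark as the "8,000-core, over 74 trillion calculations per second" headline claim.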
A cure for cancer?
It’s one of the worst aspects of science ‘journalism’ – looking to see if every new piece of research will deliver that chimera, a single way to cure cancer or some other huge challenge.
In fact, what Iridis will be used for is helping in a range of complex problems, and not just in the engineering and computing departments but economics, psychology and even archaeology. Research teams at Southampton are using the system to manipulate massive datasets and run simulations to visualise what literally couldn’t be seen before. Applications include the development of new generations of super-quiet airliners, working out better ways for life-saving drugs to be absorbed and trying to computer-generate accurate pictures of ancient archaeological sites.
A senior lecturer in the University’s Archaeology Department told CBR his team is using Iridis III to create computer simulations of things like furniture and house interiors of Ancient Rome, to a level where “it’s a bit like taking a photograph of the past,” as well as trying to reconstruct the specific pigments of Classical statues – something that has never been convincingly achieved before.