San Jose, Calif. -- Government researchers are evaluating programming languages from Cray, IBM and Sun Microsystems as part of a broad effort to simplify the increasingly complex process of writing software for the world's most powerful computers. But some observers say the Defense Advanced Research Projects Agency does not have the right approach or enough support to drive the needed new direction in supercomputer software.
Experts generally praised Darpa's High Productivity Computing Systems (HPCS) program, which awarded nearly $500 million to Cray and IBM in November to build prototypes of petaflops-class systems by 2010, the goal being to make such systems easier to use (see story, page 8). A continuing evaluation of Cray's Chapel, IBM's X10 and Sun's Fortress languages aims to choose a lingua franca for programming such next-generation systems, which are expected to use tens of thousands of processors, sometimes of various flavors.
Sources said Darpa could pick one of the languages or a hybrid made up of parts from any two or three of the proposals. A decision could take 18 months.
Today's largest computers typically use the Message Passing Interface (MPI) libraries, developed by a committee of academic and industry researchers a decade ago to link C and Fortran programs across tens or hundreds of off-the-shelf processors ganged in large clusters.
"As systems become much larger, with hybrid CPUs, this just doesn't work well," said Jack Dongarra, a supercomputer researcher at the University of Tennessee who helped develop MPI. "MPI has just too much programming complexity to get it all right."
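The bookkeeping Dongarra describes can be suggested with a toy sketch in plain Python (this is an analogy, not real MPI; the `Mailbox` class, rank numbers and tags are hypothetical, invented purely for illustration): with explicit message passing, the programmer must pair every send with a matching receive by source rank and tag, and a single mismatch stalls or corrupts the run.

```python
from collections import defaultdict, deque

class Mailbox:
    """Toy stand-in for a message-passing runtime: explicit send/recv,
    matched by (destination, source, tag). Not real MPI -- an analogy."""
    def __init__(self):
        self._queues = defaultdict(deque)  # (dest, src, tag) -> queued messages

    def send(self, src, dest, tag, data):
        self._queues[(dest, src, tag)].append(data)

    def recv(self, dest, src, tag):
        q = self._queues[(dest, src, tag)]
        if not q:
            # In a real system this would be a hang, not a clean error.
            raise RuntimeError(f"rank {dest}: no message from {src}, tag {tag}")
        return q.popleft()

# A two-rank boundary exchange: each rank must name its partner
# and tag exactly, in both directions.
mb = Mailbox()
mb.send(src=0, dest=1, tag=7, data=[1.0, 2.0])  # rank 0's boundary values
mb.send(src=1, dest=0, tag=7, data=[3.0, 4.0])  # rank 1's boundary values
halo_on_1 = mb.recv(dest=1, src=0, tag=7)
halo_on_0 = mb.recv(dest=0, src=1, tag=7)
```

Scaled to tens of thousands of ranks, this manual pairing of sends and receives is exactly the complexity the researchers want the new languages to hide.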
"We have to get away from message passing between the zillions of nodes that make up a system," said Jim Mitchell, a Sun research manager who has headed Sun's supercomputing efforts.
Cray, IBM and Sun aim to deliver languages that let programmers treat a complex system as a single, uniform pool of memory, so that diverse processors can communicate with one another more easily.
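The "uniform pool of memory" idea can be sketched with a toy partitioned array in plain Python (the `GlobalArray` class is a hypothetical illustration, not actual Chapel, X10 or Fortress syntax): the programmer indexes one logical array, and the runtime, not the programmer, maps each index to a node and a local offset.

```python
class GlobalArray:
    """Toy globally addressed array, block-distributed across
    simulated nodes. Purely illustrative of the programming model."""
    def __init__(self, size, nodes):
        self.block = size // nodes  # assumes size divides evenly
        # One local memory segment per simulated node.
        self.mem = [[0.0] * self.block for _ in range(nodes)]

    def _locate(self, i):
        # The runtime's job: global index -> (node, local offset).
        return i // self.block, i % self.block

    def __getitem__(self, i):
        node, off = self._locate(i)
        return self.mem[node][off]

    def __setitem__(self, i, value):
        node, off = self._locate(i)
        self.mem[node][off] = value

# The programmer writes ordinary global indexing; data placement
# and inter-node communication are hidden behind the abstraction.
a = GlobalArray(size=8, nodes=4)
for i in range(8):
    a[i] = float(i)
print(a[5])  # stored on node 2, offset 1, but the code never says so
```

The design point is the same one the vendors are making: moving the node-and-offset arithmetic out of application code and into the language runtime is what makes tens of thousands of processors tractable for ordinary programmers.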
The languages are still in an early research phase, however, and one researcher who has had a hands-on look at them said that it's not clear whether they will adequately address looming problems in parallel programming in a vendor-neutral manner.
"They are still in a primitive state, and there is no evidence yet they will be embraced by the application community," said Marc Snir, a professor at the University of Illinois (Urbana-Champaign) who had a hand in MPI's development.
"The language is just one small component of the problem," added Snir. "You also need lots of libraries and tools for performance tuning, debugging and integration with existing libraries."