Tangled Threads and Skewed Strands
But parallelism is not the only force driving compilers to get more sophisticated.
"This is not on everyone's mind yet," Tarditi says, "but we're working to improve the reliability of software. Hackers are moving up the software food chain. We need language constructs and compiler support for more reliable software." Phoenix and Singularity are both active testbeds at Microsoft for developing tools to build more reliable software, with a particular focus on ahead-of-time compilation for typesafe languages. In Phoenix, the compiler and tools are tightly integrated.
"What would it take," Tarditi asks, "to use, in practice, more modern languages for system programming?" That is, languages with type checking, array bounds checking, and the like. Microsoft is not only exploring this in-house but also makes Singularity available to academic researchers for such basic research.
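The safety properties Tarditi mentions can be illustrated with a small sketch in Rust, a modern systems language (not one named in the article) whose compiler and runtime enforce array bounds checking. Where a C program might silently read past the end of a buffer, a checked access here either returns a value or reports the error:

```rust
fn main() {
    let buf = [10, 20, 30];

    // `get` is a bounds-checked access: it returns Some(&value) for a
    // valid index and None for an out-of-range one, instead of reading
    // arbitrary memory as unchecked C indexing could.
    match buf.get(5) {
        Some(v) => println!("value: {}", v),
        None => println!("index 5 is out of bounds"),
    }

    // Plain indexing is also checked: buf[5] would panic at runtime
    // rather than corrupt memory, which is the kind of reliability
    // guarantee the article describes for system programming.
}
```

Even the direct-index form fails loudly and deterministically, which is exactly the class of guarantee that makes such languages attractive for systems code.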
"A little farther out," Tarditi says, they're looking into typed assembly language. The sales pitch for Java when it first came out was Java applets in verifiable bytecode; what if you could have that trustworthiness in distributed native machine code? "The next three to four years is my near term," Tarditi points out. "This might be farther out than five years."
Microsoft is serious about tackling reliability at the compiler, tools, and OS levels, but it is not alone in recognizing the seriousness of the problem. "At Intel," Reinders says, "I see how reliable silicon needs to be, and it amazes me how far all software is from this level of quality. It is a problem begging to be solved."
Researchers at Microsoft and elsewhere are aggressively looking for solutions to the reliability problem as well as exploring compiler strategies for dealing with parallelism.
Some of this research is bringing products to market right now.
Flashback to MIT in 1990: The Alewife project, under the leadership of Professor Anant Agarwal, aimed to demonstrate that a parallel computer system could be made both scalable and easily programmable. The large-scale shared-memory supercomputer that the Alewife team built provided many insights about parallel programming that fed into Agarwal's RAW project, whose goal was "to provide performance that is comparable to that provided by scaling an existing architecture, but that can achieve orders of magnitude more performance for applications in which the compiler can discover and statically schedule fine-grain parallelism."
RAW led to the RAW Compiler Project, which shifted all responsibility for optimization from the hardware to the compiler, making the compiler smarter about parallelism. All of this in turn led Agarwal to found a spin-off company, Tilera, which you will recall from the beginning of this article.
Interesting times lie ahead.