Flashback: Fortran, Parallelism, and Optimizing Compilers
The emergence of multicore processors is only one of the challenges driving significant advances in compiler smarts, although the consensus of those we polled is that it is the most important. Reliability is another biggie; more about that later. In a sense, though, these challenges and the advances they are driving are nothing new.
It seems strange today to realize that the very idea of compiling code was once controversial. But the prevailing view some 50 years ago was that these automatic-programming gimmicks couldn't possibly be smart enough to do the job.
"To them," John Backus wrote in 1980 ("them" being the people he called the priesthood of early computing), "it was obviously a foolish and arrogant dream to imagine that any mechanical process could possibly perform the mysterious feats of invention required to write an efficient program." ("Programming in America in the 1950s: Some Personal Impressions," in A History of Computing in the Twentieth Century, Metropolis, 1980.)
Although Grace Hopper is rightly credited as the inventor of the compiler, it was John Backus and his team who, in creating Fortran, challenged conventional wisdom on compilation and articulated the bottom-line case for compilers. They also came up with many key optimizations still in use today, such as moving computations out of inner loops, eliminating dead code, and using estimates of expected execution frequency to guide register allocation.
But "our plans and efforts," he wrote, "were regarded with a mixture of indifference and scorn until the final checking out of the compiler..."
The scenario of processors hitting a speed limit and a solution being offered in the form of multiprocessor machines with parallel-savvy compilers is not new, either. That's what Illiac IV was all about, with its massively parallel array of 64 processing elements (scaled back from the 256 originally planned) supported by several parallel-optimized Fortran compilers. In the late 1970s, for problems that could be parallelized, Illiac was the fastest computer in the world. It did have a little problem with reliability, which, as mentioned, is another big issue today.
From the earliest Fortran compilers, developers were working to make them smart, to take up more of the slack from programmers. Today, competition drives compilers toward more smarts. "A landmark in compiler technology," Reinders says, "was the global optimizer in GCC 4.0 [in 2005]. It marked the end of the era of 'nonoptimizing compilers.' Every compiler needs to be very complex to compete these days."
In 1986, one programmer summed up the contemporary view of compilers: "People still get great satisfaction out of the fact that a compiler...can't write code as well as a human being. But...I think that within the next five years we'll have tools that will be able to do as good a job as a human programmer." (Bill Gates, Programmers at Work, Microsoft Press, 1986.)