BERKELEY, Calif. – The post-lunch session began with David DeWitt, a former University of Wisconsin professor who heads Microsoft’s new Jim Gray Systems Lab in Madison, announced last month.
DeWitt described how Gray developed the standard benchmarking approach for measuring the speed and economy of transaction processing systems.
DeWitt had actually developed a different benchmarking approach earlier, but Gray “told me I had it all wrong and did his own. This is how computer science works.”
Gray’s scheme appeared in an anonymous paper titled “A Measure of Transaction Processing Power” that was published in the Datamation trade magazine in 1985.
Gray’s sense of humor showed through: He worked hard to make sure it was published on April 1, and the last line read, “There are lies, damn lies and then there are performance measures.”
The beauty of Gray’s approach was how simple it was, DeWitt explained: It basically mimics a banking transaction done by an ATM or a bank teller.
It launched the benchmarking race among computer system vendors (here’s a 2002 story I wrote about TPC benchmarking and Microsoft’s efforts) and established rules to avoid cheating, DeWitt said.
More importantly, the standardized measurements gave hardware and software companies more incentive to innovate and improve the performance of their systems.
“Transactions are cheap because Jim designed a benchmark that changed the industry,” he said.
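The benchmark transaction DeWitt described updates three balances (account, teller, branch) and appends a history record. As a rough illustration only, here is a minimal sketch in Python with SQLite; the table names, schema, and `debit_credit` function are hypothetical simplifications, not the benchmark’s actual specification:

```python
import sqlite3

# In-memory database with simplified versions of the benchmark's tables.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE accounts (id INTEGER PRIMARY KEY, branch_id INTEGER, balance INTEGER);
    CREATE TABLE tellers  (id INTEGER PRIMARY KEY, branch_id INTEGER, balance INTEGER);
    CREATE TABLE branches (id INTEGER PRIMARY KEY, balance INTEGER);
    CREATE TABLE history  (account_id INTEGER, teller_id INTEGER, amount INTEGER);
    INSERT INTO branches VALUES (1, 0);
    INSERT INTO tellers  VALUES (1, 1, 0);
    INSERT INTO accounts VALUES (1, 1, 1000);
""")

def debit_credit(account_id, teller_id, branch_id, amount):
    """One ATM-style transaction: adjust three balances, log the event."""
    with conn:  # commits on success, rolls back on error
        conn.execute("UPDATE accounts SET balance = balance + ? WHERE id = ?",
                     (amount, account_id))
        conn.execute("UPDATE tellers SET balance = balance + ? WHERE id = ?",
                     (amount, teller_id))
        conn.execute("UPDATE branches SET balance = balance + ? WHERE id = ?",
                     (amount, branch_id))
        conn.execute("INSERT INTO history VALUES (?, ?, ?)",
                     (account_id, teller_id, amount))
    return conn.execute("SELECT balance FROM accounts WHERE id = ?",
                        (account_id,)).fetchone()[0]

print(debit_credit(1, 1, 1, -100))  # a 100-unit withdrawal -> prints 900
```

The benchmark’s simplicity is the point: timing how many such transactions a system can complete per second (and at what cost) gives a single comparable number across vendors.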
Next was Gray’s friend and co-worker Gordon Bell, another San Francisco-based computing pioneer in Microsoft’s research group.
Bell gave a funny and touching talk about ways that Gray could achieve immortality, through memorials, books, online avatars and by naming things such as new units of measurement after the man.
It’s poignant because Gray already achieved a sort of immortality through his huge contributions to computer science and the generation of people he mentored.
“Then of course there’s Gray matter – we’ve named a part of the brain after Jim,” Bell said.
Gray’s name could also be applied to algorithms, procedures, laws of computing or units of computing performance.
But the best may be a new paradigm for science that Gray was advocating late in his career, what Bell called the Gray paradigm for data exploration.
Science used to be empirical, describing natural phenomena. Then, over the last few hundred years, it became theoretical, using models and generalization. In the last few decades, computation has enabled scientists to simulate complex phenomena.
That’s led to a new approach in recent years that further mixes computer science with other sciences, unifying theory, experiment and simulation. It involves huge amounts of data captured by instruments and simulation, software to process it and information or knowledge that’s stored in computers.
In this new kind of science, Bell said, “computer scientists really learn the scientists’ science and become a co-partner or a twin working on science overall.”
The next presentations were evidence of the success of this approach, at least where Gray was involved: Cosmologist Alex Szalay related how he and Gray worked together to bring the Sloan Digital Sky Survey to the Web, and Microsoft Research’s Curtis Wong demonstrated the WorldWide Telescope project, which blends that sort of data into a searchable, explorable 3D representation of space.
Then Ed Saade, who did sonar searches for Gray’s boat, explained the approach and how the data collected was given to the state of California to help build its underwater maps that are being shared online with researchers and the public.
Finally, Jim Bellingham from the Monterey Bay Aquarium Research Institute, who was introduced to Gray by the University of Washington’s Ed Lazowska, showed how Gray helped the organization build tools to model, analyze and share its ocean exploration data.