UI scientists to work with Intel, Microsoft

By Peter Kim

University scientists will team up with computer technology giants Microsoft and Intel for a joint research project that will develop ways to increase the speed of computers by improving their processors.

Speeding up the processor, the brain of the computer, will allow for faster computing, smoother graphics and more complex programs.

“The clock speed (of processors) has been increasing very slowly in the last few years,” said Marc Snir, researcher and University professor of computer science.

“The only way to speed them up is to stack processors.”

Snir described the idea of stacking processors, commonly known as parallel processing, using the processor in a laptop as an example. The individual processors are not faster, he said; rather, having more of them allows for faster computing.
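The divide-and-combine idea behind parallel processing can be sketched in a few lines of Python. This is an illustration, not the researchers' code: a job is split into chunks, the chunks are handed to a pool of workers, and the partial results are combined. (A thread pool is used here for simplicity; the multicore speedups Snir describes come from running such workers on separate physical processors.)

```python
from concurrent.futures import ThreadPoolExecutor

def sum_chunk(chunk):
    # Each worker handles its own slice of the data.
    return sum(chunk)

def parallel_sum(data, workers=4):
    # No single worker is faster, but splitting the job means
    # the workers can finish sooner together.
    size = max(1, len(data) // workers)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        partial = pool.map(sum_chunk, chunks)  # chunks processed concurrently
    return sum(partial)
```

The answer is identical to a serial sum; only the way the work is distributed changes.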

Vice Chancellor for Research Charles Zukoski said parallel processing has been primarily implemented in supercomputers.

“There are supercomputers that have over 100,000 processors,” Zukoski said.

However, Zukoski added that these supercomputers are not used for consumer purposes but are primarily used for large-scale scientific calculations, such as predicting the collision of galaxies.

Snir said the research is focused on readying these parallel processing techniques for transfer into consumer products. This would mean sharper graphics for video games, faster Internet downloading and similar upgrades to every sector of the consumer electronics market.

One application that stands to gain much from parallel processing is speech recognition.

An example would be the “Babel Fish” from “The Hitchhiker’s Guide to the Galaxy,” Snir said. This is a device that, when placed in a user’s ear, translates any language that passes through it into English.

“Right now doing all that takes (a device) too long to compute, but with parallel processing, things like simultaneous translation are a possibility,” Snir said.

Parallel processing can increase the overall intelligence of a computer, he added. For example, it could enhance a computer’s natural language recognition.

An example of natural language recognition is Google’s analysis of text with its search engine. Whenever a search is typed into Google, the search engine analyzes the words and attempts to assign each word a syntactic role to determine the larger overall meaning of the search.

“If you type in something with ‘Kennedy’ into Google, it doesn’t know what ‘Kennedy’ you mean,” Snir said. “Is it a president? Is it an expressway? Natural language recognition tries to figure what is the original meaning.”
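A toy Python sketch can make the “Kennedy” ambiguity concrete. The sense inventory and keywords below are invented for illustration; real systems use far richer statistical models. The idea is simply to score each candidate meaning by how many of the query’s other words it matches.

```python
# Hypothetical sense inventory: each meaning of "Kennedy" is
# associated with words that tend to appear alongside it.
SENSES = {
    "president": {"jfk", "president", "assassination", "white", "house"},
    "expressway": {"traffic", "chicago", "expressway", "exit", "congestion"},
}

def disambiguate(query):
    # Pick the sense whose keyword set overlaps the query the most.
    words = set(query.lower().split())
    return max(SENSES, key=lambda sense: len(SENSES[sense] & words))
```

So “kennedy expressway traffic” resolves to the road, while “president kennedy assassination” resolves to the person, purely from the surrounding words.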

Consumer products are not the only technology to benefit from parallel processing. Biomedical instruments, such as MRI scanners, may also become faster and more efficient.

“Right now, when you get an MRI, it takes a great deal of time to go from the signals to an image. You can’t lie in an MRI and look at the image in real time,” Zukoski said. “With parallel processing, computers will be fast enough to construct that image instantaneously.”
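The “signals to an image” step Zukoski mentions is, at its core, an inverse Fourier transform: the scanner records frequency-domain samples, and the image is recovered from them. A minimal one-dimensional sketch in pure Python (not the actual reconstruction pipeline) shows why this parallelizes well: each output point depends only on the input signal, not on the other output points, so all of them can be computed simultaneously on separate processors.

```python
import cmath

def inverse_dft(signal):
    # Recover n sample values from n frequency-domain measurements.
    # Every output point x is an independent sum over the input,
    # so each one could be assigned to a different processor.
    n = len(signal)
    return [
        sum(signal[k] * cmath.exp(2j * cmath.pi * k * x / n) for k in range(n)) / n
        for x in range(n)
    ]
```

For instance, a flat spectrum of all ones reconstructs to a single spike at the first sample.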

Snir said the University is no newcomer to the computer processing arena. The University has more than 50 years of history in computer processing technology and has been involved in parallel processing since the 1960s.

The University’s first parallel processing project began with the Illinois Automatic Computer IV (ILLIAC IV) and continues with its current involvement in Blue Waters, a large-scale supercomputer that will be used for scientific purposes such as modeling global warming.

According to the University’s Engineering Web site, Microsoft and Intel will establish two research centers for parallel processing and fund the project with $18 million over five years. One will be located at the University of Illinois, and the other will be established at the University of California, Berkeley campus.

“This is the technology of our times. It’s about solving problems. … It’s about developing new knowledge and transferring it for use in society,” Zukoski said.