UCLA Engineering Part of a New NSF “Expeditions in Computing” Team

Apr 4, 2012

By Wileen Wong Kromhout

The UCLA Henry Samueli School of Engineering and Applied Science is part of a team of researchers just awarded $10 million by the National Science Foundation (NSF) to make computer programming faster, easier and more intuitive. Named ExCAPE (Expeditions in Computer Augmented Program Engineering), the project is part of NSF’s “Expeditions in Computing” program, which funds teams with ambitious, fundamental research agendas in computer science.

The five-year project is a highly collaborative effort that will involve multiple research institutions and industry partners and will include an educational outreach component. The University of Pennsylvania will lead the project with a team of researchers from UCLA, UC Berkeley, Cornell, the University of Illinois at Urbana-Champaign, the University of Maryland, MIT, the University of Michigan and Rice University.

In today’s programming languages, programmers must write out explicit instructions for what they want the program to do. For large projects, this kind of coding is so complex that programmers need separate “verification” teams to weed out errors. Over the last 20 years, verification technology has led to powerful analysis tools that can find subtle mistakes in real-world systems. The ExCAPE team plans to leverage these advances and help programmers avoid such mistakes in the first place.

“Computers have evolved at a dramatic pace, but technology that’s used to develop programs and software is evolving comparatively slowly,” said Rajeev Alur, professor of computer and information science at the University of Pennsylvania and lead investigator of the project.

ExCAPE has several different research threads and application areas. Common to all of them are the computational engines that will be used to synthesize software.

Paulo Tabuada, the only co-principal investigator from UCLA Engineering, will be collaborating with researchers on the team to improve existing computational engines as well as develop new ones.

“Of particular interest to me is the synthesis of software that is robust with respect to faults and failures that have not been modeled,” said Tabuada. “This type of robustness is especially important for safety-critical systems that need to operate correctly for extensive periods of time in environments that are not completely known at design time.”

Tabuada was approached to be a part of the project because of the work conducted in his lab on Cyber-Physical Systems (CPS), systems that require tight integration between software and the physical world. Examples of CPSs include cars, unmanned aircraft, medical devices and very large-scale systems like the power grid.

“We have been extending several techniques originally developed for the control of physical systems so that they also apply to the cyber components of CPSs. It turns out that these techniques also offer great potential for the synthesis of software-only systems,” said Tabuada.

Tabuada has already developed several notions of robustness for software, along with algorithms to synthesize robust software and to verify the robustness of existing software. The next step is to make these results easier for programmers to use by integrating them into programming languages and their compilers.

“We have an amazing team of very distinguished researchers that fathered many of the techniques that are used today in the verification of software systems,” said Tabuada. “It is very exciting to be part of this team as we make the transition from verification to synthesis.”

The researchers are proposing an integrated toolkit for automated program synthesis. Such a toolkit would allow a programmer to collaborate with a computer on writing a program, with each contributing the parts they are most suited to. With more powerful and integrated verification systems, the computer would be able to give feedback about errors in the program, and even propose corrections.

Rather than spending their time on small details, programmers will be able to leave those details to the synthesis engines, instead specifying high-level goals and providing additional requirements until the synthesizer produces the desired code.

“There will be a dramatic increase in productivity as well as in the complexity of the software systems we will be able to design,” said Tabuada. “In this way, the human programmer can shape the software by focusing on the tasks that are easier for humans, while the more laborious work, which is error-prone from a human perspective, is left to the computer.”
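To give a flavor of this division of labor, here is a minimal, purely illustrative sketch in Python. It is not the ExCAPE toolkit, and every name in it is made up for illustration: the “specification” is a handful of input-output examples, and a brute-force enumerative search over a tiny expression grammar stands in for the synthesis engine, filling in the low-level details.

```python
# Purely illustrative sketch of specification-driven synthesis (not ExCAPE code).
# The programmer states *what* the function should do via input-output examples;
# an enumerative search over a tiny expression grammar works out *how*.

from itertools import product

# High-level goal: find f such that f(x) matches every example below.
EXAMPLES = [(0, 1), (1, 3), (2, 5), (5, 11)]  # behavior of f(x) = 2*x + 1

TERMINALS = ["x", "0", "1", "2", "3"]  # leaves of the expression grammar
OPERATORS = ["+", "-", "*"]            # binary operators allowed in candidates


def candidates(depth):
    """Enumerate candidate expression strings up to a given nesting depth."""
    if depth == 0:
        yield from TERMINALS
        return
    yield from candidates(depth - 1)  # anything shallower is also a candidate
    for left, op, right in product(candidates(depth - 1), OPERATORS, candidates(depth - 1)):
        yield f"({left} {op} {right})"


def synthesize(examples, max_depth=2):
    """Return the first candidate consistent with all examples, or None."""
    for expr in candidates(max_depth):
        # eval is acceptable here because the search only evaluates its own strings.
        if all(eval(expr, {"x": x}) == y for x, y in examples):
            return expr
    return None


print(synthesize(EXAMPLES))  # prints an expression equivalent to 2*x + 1, e.g. "(x + (x + 1))"
```

The synthesis engines ExCAPE is developing rely on far more powerful reasoning than this exhaustive search and accept richer specifications than input-output examples, but the workflow is the same: the programmer states what the code should do, and the machine searches for code that does it.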

Other challenge areas include setting up routing policies for the flow of information across networks of computers, correctly translating software written for single-processor computers so that it performs well on the multicore processors now common even in mobile devices, and designing efficient, correct algorithms for coordinating decisions among multiple computers.

The team hopes that the resulting toolkit will change public perception of computer science and, eventually, how it and other sciences are taught. They would like to show that programming does not mean tedious coding according to strict rules and guidelines but rather the ability to express computational strategies.
