The project aims to give cinematographers and their crews wireless, hand-held access to real-time information about equipment, light levels, and the locations of lights, actors, props and other important assets during filming. The same information, acquired and presented in real time, could also support continuity management in later setups of the same scene and post-production integration with computer graphics. Despite recent advances in special effects, directors are often still limited by the need to preset conditions when filming scenes that will be integrated with computer-generated material.
ATC is part of a larger partnership between the professional schools, led by film professor Fabian Wagmister and electrical engineering professor William Kaiser. The two co-direct the new Media, Performance and Engineering Research Center, still in its development stage after being created in late 2004 by Deans Robert Rosen and Vijay K. Dhir.
By focusing sensor technology on cinematography concerns such as lighting and cameras, sensor network experts are developing a better understanding of how their systems can aid current film production, as well as how production may be transformed by emerging technologies that bridge the physical and digital worlds.
“Working with the School of Theater, Film and Television provides a unique opportunity for researchers in engineering,” noted Jeff Burke, a UCLA engineering alumnus and researcher in the HyperMedia Studio. “For one thing, the facilities and productions provide the engineers on this project with a real-world environment for testing elements of the sensor network and developing interfaces that will be of use to cinematographers.”
The ATC collaboration, supported by a grant from the Intel Research Council, builds on a white paper developed in 1999 by Kaiser, Wagmister and Burke with cinematography professor William McDonald and electrical engineering professor Greg Pottie. It also draws on collaborative experience at TFT’s HyperMedia Studio, including senior design courses that brought together students from engineering and theater, film and television, training them to apply lessons from other disciplines and to develop a shared understanding of terms common to both fields that carry very different meanings.
“This project brings arts and engineering back together. Academia has compartmentalized learning, creating distinct disciplines on campus,” said Pottie, who is also associate dean of research and physical resources for engineering at UCLA. “To build true understanding across the fields, we need to work together in new ways. And we’ve discovered that if we present our students with an opportunity to see engineering from another approach, they jump at it.”
Vanessa Holtgrewe, an MFA student in cinematography, brings both artistic and practical knowledge of film production to the team. She has worked extensively as a cinematographer and camera operator, and is currently on location in Boston as director of photography for TLC’s American Firefighter reality series. Soon after she joined the project, Holtgrewe ran a filmmaking “boot camp” for engineering students. They spent half a day on a sound stage, where she showed them the basics of set design and filming. All of the students had an opportunity to operate a video camera, set up and adjust the lighting and serve as actors.
“By working together, as we learn the basics of the other’s field,” explained Holtgrewe, “the questions we’re asking have become better defined.”
The project also brings together engineering research in programming languages, wireless sensor networks, and embedded systems with the middleware and authoring tools being developed at TFT to make it easier for artists to experiment with new technologies.
UCLA engineers are adapting light sensors to a wireless platform developed in previous projects; they are also investigating the use of camera platforms being developed by the Center for Embedded Networked Sensing (CENS) at UCLA.
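The article does not describe the platform’s data format, but a minimal sketch in C, with hypothetical field names and a simulated reading, suggests how such a node might package a light sample for wireless broadcast:

```c
#include <stdio.h>
#include <stdint.h>

/* Hypothetical packet layout for a light-sensor node; the actual ATC
 * platform's format is not described in the article. */
typedef struct {
    uint16_t node_id;         /* which sensor package sent the sample */
    uint32_t timestamp_ms;    /* capture time, useful for continuity records */
    uint16_t illuminance_lux; /* light intensity */
    uint16_t cct_kelvin;      /* correlated color temperature */
} light_sample_t;

/* Stand-in for the platform's radio send routine; here we just print. */
static void radio_send(const light_sample_t *pkt) {
    printf("node %u @ %lu ms: %u lux, %u K\n",
           pkt->node_id, (unsigned long)pkt->timestamp_ms,
           pkt->illuminance_lux, pkt->cct_kelvin);
}

int main(void) {
    light_sample_t sample = { 7, 120500, 800, 5600 }; /* simulated reading */
    radio_send(&sample);
    return 0;
}
```

Keeping such a packet small matters on low-bandwidth sensor radios, where every transmitted byte costs energy.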
In addition to using commercial radio-frequency identification (RFID) systems, students in Professor Mani Srivastava’s Networked and Embedded Systems Lab (NESL) are adapting the lab’s own active sensing platforms to support high-precision tags for localizing lights, people and objects. They are also building small sensor packages that can acquire precise readings of color temperature and light intensity.
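The article does not say how these packages derive color temperature, but one standard method, shown here as a sketch rather than the NESL implementation, is McCamy’s approximation, which estimates correlated color temperature from a sensor’s CIE 1931 (x, y) chromaticity coordinates:

```c
#include <stdio.h>

/* McCamy's approximation: correlated color temperature (CCT), in kelvin,
 * from CIE 1931 (x, y) chromaticity. Valid over roughly 2000-12500 K. */
static double cct_mccamy(double x, double y) {
    double n = (x - 0.3320) / (y - 0.1858);
    return -449.0 * n * n * n + 3525.0 * n * n - 6823.3 * n + 5520.33;
}

int main(void) {
    /* D65 daylight chromaticity; expect roughly 6500 K. */
    printf("CCT = %.0f K\n", cct_mccamy(0.3127, 0.3290));
    return 0;
}
```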
“To make a system that will be truly useful to cinematographers,” explained Jonathan Friedman, a graduate student researcher in NESL, “we need to develop a system that places reliability and function before the lifetime of the sensor. Ninety-percent reliability won’t be good enough. It has to be able to deliver all the data all the time.”
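One conventional way to honor that requirement, favoring delivery over node lifetime, is to retransmit each sample until it is acknowledged. The sketch below simulates the idea; the link model and function names are illustrative, not the ATC system’s actual protocol:

```c
#include <stdio.h>
#include <stdlib.h>

/* Simulated lossy link: returns 1 if the frame was acknowledged.
 * A real node would transmit over the radio and wait for an ACK frame. */
static int send_and_wait_ack(int sample) {
    (void)sample;
    return rand() % 10 < 9; /* model a link that succeeds 90% of the time */
}

/* Retransmit until acknowledged: every sample is eventually delivered,
 * at the cost of extra radio energy for each lost frame. */
static int send_reliable(int sample) {
    int attempts = 1;
    while (!send_and_wait_ack(sample))
        attempts++;
    return attempts;
}

int main(void) {
    srand(42);
    for (int s = 0; s < 5; s++)
        printf("sample %d delivered after %d attempt(s)\n",
               s, send_reliable(s));
    return 0;
}
```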
The UCLA team recently demonstrated the system for members of the American Society of Cinematographers, a professional society, and received valuable feedback on potential system features that working cinematographers would find useful. They then took the same demonstration to the ACM SenSys 2004 conference in Baltimore, Maryland.
“For the first time I can see that my research isn’t just theoretical,” said Friedman. “We have a really plausible application that’s close to being reality. With just a few modifications to existing systems, it will be mechanically and physically feasible.”
In the spring, the ATC team hopes to deploy a prototype on an independent film to test its data-gathering capabilities in a professional sound-stage environment.