Inexpensive consumer cameras may soon be shooting in the third dimension.

Optics researchers at the University of Arizona have developed a novel way to extract depth information from a scene using a conventional camera and a portable projection system similar to a camera's external flash.

Gabriel Birch, an optics graduate student and lead researcher for the project, said the camera system could enable photographers to shoot three-dimensional images directly on their digital cameras.

He said the technology currently requires a small amount of refinement in an editing program to produce high-quality, three-dimensional images.

However, he said the UA team of researchers is designing camera software that will allow the system to capture three-dimensional images in real time.

"Our system could really change how researchers, medical professionals and the average consumer take photos in 3D," he said.

Birch and optics professors Scott Tyo and Jim Schwiegerling published their work in the March 12 edition of the online optics journal Optics Express.

The UA team uses a projector fitted with a specially designed lens to encode depth information onto an object they want to photograph.

Then, using an algorithm, a custom series of computer instructions, they extract the information needed to build a three-dimensional model of the scene.

"The algorithm is designed to extract the particular information we are projecting," Tyo said. "So we have this enormous advantage: we know what we are looking for because we put it there."
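The advantage Tyo describes resembles matched filtering: because the researchers choose the projected pattern, they can search the captured image for exactly that pattern. The sketch below is illustrative only, not the team's published algorithm; the signals and names are invented for the demonstration.

```python
import numpy as np

# Illustrative sketch only, not the team's algorithm: knowing the
# projected signal lets you pull it out of a noisy capture by
# correlation, the way a matched filter picks a known signal out of
# noise. All signal shapes and names below are assumptions.

rng = np.random.default_rng(0)
known = np.sin(np.linspace(0, 8 * np.pi, 256))   # the pattern we projected
guess = np.cos(np.linspace(0, 8 * np.pi, 256))   # a pattern we did not project
scene = rng.normal(0.0, 0.3, 256)                # unrelated scene texture
capture = scene + 0.5 * known                    # the camera records both

# Correlation against the known pattern recovers its strength (~0.25 here);
# correlation against anything else stays near zero.
score_known = np.dot(capture, known) / len(known)
score_guess = np.dot(capture, guess) / len(guess)
```

Knowing what was projected turns a hard inference problem into a targeted search, which is why the team's algorithm can reliably isolate the depth-encoding pattern from ordinary scene detail.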

Tyo said their system is fundamentally different from consumer cameras and 3D-imaging medical equipment in several ways.

First, he said their system requires only one camera lens.

Conventional 3D cameras use two lenses to photograph a single object from two perspectives, similar to how human eyes create a sense of depth.

"These cameras work well for shooting in 3D but compromise 2D image resolution," he said.

Tyo said one of the advantages of their system is that it does not hamper a photographer's ability to shoot high-quality two-dimensional photos.

Plus, it requires only an inexpensive consumer camera and the projector system. He said other single-lens systems that shoot in 3D require large amounts of hardware and multiple cameras and light sources.

"The real advantage of our technology would be for someone like you or me," Tyo said. "An average consumer who might think it would be cool if I could get a 3D shot of my kid's face with the camera I already have."

Birch demonstrated how the process works in the lab. He projected a gridlike pattern of horizontal and vertical lines onto a scene of objects of varying shape and size.

He said the projected vertical lines are in focus close to the camera while the projected horizontal lines are in focus farther away.

He took two pictures, one with the projected pattern and one without it. Then, using specially designed post-production software, he extracted depth information across the scene from the two images.
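The steps Birch demonstrated, differencing the two exposures and then comparing how sharply each line orientation is rendered at each point, can be sketched in code. This is a hedged reconstruction of the idea, not the team's published algorithm; the function name, block-based windowing, and normalized contrast measure are all assumptions.

```python
import numpy as np

def depth_cue(with_pattern, without_pattern, block=16):
    """Illustrative sketch (not the published algorithm): per-block
    relative depth cue in [-1, 1]. Positive where vertical lines
    (focused near the camera) dominate, negative where horizontal
    lines (focused farther away) dominate."""
    # Differencing the two exposures isolates the projected grid
    # from the scene's own texture.
    p = with_pattern.astype(float) - without_pattern.astype(float)

    # Crop so the image tiles evenly into block x block cells.
    h, w = (d - d % block for d in p.shape)
    p = p[:h, :w]

    # Vertical lines produce strong horizontal intensity gradients;
    # horizontal lines produce strong vertical gradients.
    gx = np.abs(np.diff(p, axis=1, prepend=p[:, :1]))
    gy = np.abs(np.diff(p, axis=0, prepend=p[:1, :]))

    # Average gradient energy per block for each orientation.
    def block_mean(g):
        return g.reshape(h // block, block, w // block, block).mean(axis=(1, 3))

    ex, ey = block_mean(gx), block_mean(gy)
    # Normalized sharpness contrast: which orientation is crisper here?
    return (ex - ey) / (ex + ey + 1e-9)
```

Because the projector's lens focuses the two line orientations at different distances, the sign and magnitude of this contrast vary with how far each surface sits from the camera, which is the depth cue the real software calibrates into a 3D model.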

The picture he eventually pulled up on a computer screen showed the objects in high-quality 3D.

The system can currently take three-dimensional images a little over three feet from the camera. However, Birch said, with slight modifications researchers hope to double the distance.

Birch said the next steps will make the system consumer-friendly, portable and capable of imaging objects farther away.

The researchers also are working to perfect the setup and image processing for use with video capture.

Schwiegerling said the team is looking for a corporate partner to help fund the research. One of the real challenges so far has been a lack of funding.

"A solid funding source will help us get the ball rolling," he said.

Will Ferguson is a NASA space grant intern.