New invention transforms any smartphone or TV display into a holographic projector
Scientists have developed a method for creating 3D holograms using "incoherent light" emitted from mobile devices — turning an iPhone 14 Pro into a holographic projector.
Researchers have created holograms using the light emitted from an ordinary smartphone screen — effectively turning an iPhone into a holographic projector.
Using a device called a spatial light modulator (SLM), scientists transformed a 2D image displayed on an iPhone 14 Pro into a 3D hologram. They detailed their findings in a study published April 2 in the journal Optics Letters.
The researchers employed a technique they called a "cascade of holograms," whereby the light from a static image is repeatedly modified to create a multi-layered, 3D image.
In the study, the cascade began with a static color image shown on an iPhone. Light waves emitted from it were refined through the SLM — a device used to control and adjust the phase (timing), amplitude (strength or brightness) and polarization (direction) of light waves. Using the SLM, scientists progressively refined and layered the light waves to build up the 3D image step-by-step.
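The paper itself does not include code, but the general idea of a cascade can be sketched numerically. In the toy model below, written purely for illustration, one color channel of the screen image is treated as a complex light field that alternately passes through hypothetical SLM phase layers and travels a short distance between them; every parameter (wavelength, pixel pitch, layer spacing) is an assumed value, not a figure from the study.

```python
# Toy sketch (not the authors' code): a "cascade" modeled as alternating
# SLM phase layers and short free-space propagation, for one color channel.
# All parameters (wavelength, pixel pitch, distances) are illustrative guesses.
import numpy as np

def angular_spectrum_propagate(field, wavelength, pitch, distance):
    """Propagate a complex field by `distance` using the angular spectrum method."""
    n = field.shape[0]
    fx = np.fft.fftfreq(n, d=pitch)
    FX, FY = np.meshgrid(fx, fx)
    # Transfer function of free space (evanescent components are cut off).
    arg = np.maximum(1.0 / wavelength**2 - FX**2 - FY**2, 0.0)
    H = np.exp(1j * 2 * np.pi * np.sqrt(arg) * distance)
    return np.fft.ifft2(np.fft.fft2(field) * H)

n = 256
wavelength = 530e-9        # green light, meters (assumed)
pitch = 8e-6               # SLM pixel pitch, meters (assumed)
gap = 5e-3                 # spacing between cascade layers, meters (assumed)

# Start from the intensity pattern shown on the phone screen (random stand-in).
screen_image = np.random.rand(n, n)
field = np.sqrt(screen_image).astype(complex)

# Hypothetical stack of SLM phase patterns; in practice these would be
# computed so the cascade forms the desired 3D image.
phase_layers = [np.random.uniform(0, 2 * np.pi, (n, n)) for _ in range(3)]

for phase in phase_layers:
    field = field * np.exp(1j * phase)                   # SLM modulation
    field = angular_spectrum_propagate(field, wavelength, pitch, gap)

output_intensity = np.abs(field) ** 2                    # what a sensor would record
```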
To achieve the holographic effect, the scientists had to work out exactly how the light should be adjusted to turn the image on the iPhone's screen into the 3D hologram. This meant working backwards from the desired output, calculating the changes in the light's phase and amplitude needed at each step of the journey from the iPhone display through the SLM, so the hologram would be reproduced accurately.
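One standard way to solve this kind of inverse problem is an iterative phase-retrieval loop such as the Gerchberg-Saxton algorithm, which bounces between the source plane and the target image, keeping the phase from each propagation step while enforcing the known brightness at each end. The sketch below is a generic textbook version of that idea, not the specific optimization used in the paper, and in the real experiment the forward model is a cascade of layers rather than a single Fourier transform.

```python
# Generic illustration of "working backwards": a Gerchberg-Saxton-style loop
# that finds a phase pattern whose Fourier transform matches a target image.
# This is a standard textbook approach, not necessarily the method in the paper.
import numpy as np

def gerchberg_saxton(source_amplitude, target_amplitude, iterations=50):
    """Iteratively solve for the phase that maps source light to the target image."""
    rng = np.random.default_rng(0)
    phase = rng.uniform(0, 2 * np.pi, source_amplitude.shape)
    for _ in range(iterations):
        field = source_amplitude * np.exp(1j * phase)
        far_field = np.fft.fft2(field)
        # Keep the propagated phase but impose the desired target brightness...
        far_field = target_amplitude * np.exp(1j * np.angle(far_field))
        # ...then propagate back and impose the known source brightness.
        field = np.fft.ifft2(far_field)
        phase = np.angle(field)
    return phase

n = 128
source = np.ones((n, n))                                  # uniform illumination (assumed)
target = np.zeros((n, n)); target[40:90, 40:90] = 1.0     # stand-in target image
slm_phase = gerchberg_saxton(source, target)
```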
They captured images at two key points using a color image sensor. The first point was at the focal point of a Fourier transform lens (FTL) — a lens that optically performs a Fourier transform of the incoming light, bringing the modulated waves to a sharp focus at its focal plane.
The second recording point was set 0.6 inches (1.5 centimeters) away from the focal point. This enabled the sensor to record variations in depth, demonstrating the holographic display's ability to project images in 3D.
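In an idealized model, and glossing over sampling and scaling details, the field at the FTL's focal plane is the Fourier transform of the field leaving the SLM, and the second plane can be reached by propagating that field a further 1.5 centimeters. The sketch below shows that two-plane setup with assumed parameters:

```python
# Sketch of the two measurement planes (illustrative, assumed parameters):
# plane 1 is the Fourier-transform lens's focal plane, modeled as the FFT of
# the field leaving the SLM; plane 2 sits 1.5 cm further along the optical axis.
import numpy as np

def propagate(field, wavelength, pitch, distance):
    """Angular-spectrum propagation of a complex field over `distance`."""
    n = field.shape[0]
    fx = np.fft.fftfreq(n, d=pitch)
    FX, FY = np.meshgrid(fx, fx)
    arg = np.maximum(1.0 / wavelength**2 - FX**2 - FY**2, 0.0)
    H = np.exp(1j * 2 * np.pi * np.sqrt(arg) * distance)
    return np.fft.ifft2(np.fft.fft2(field) * H)

n = 256
wavelength = 530e-9                     # assumed
pitch = 8e-6                            # assumed sensor/SLM pixel pitch
field_at_slm = np.exp(1j * np.random.uniform(0, 2 * np.pi, (n, n)))

# Plane 1: focal plane of the Fourier-transform lens (ideal-lens approximation).
focal_plane = np.fft.fftshift(np.fft.fft2(field_at_slm))
image_1 = np.abs(focal_plane) ** 2

# Plane 2: 1.5 cm (0.015 m) beyond the focal plane, where depth differences show up.
defocused = propagate(focal_plane, wavelength, pitch, 0.015)
image_2 = np.abs(defocused) ** 2
```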
This research is unique because it demonstrates how "incoherent" light from everyday devices like smartphones and laptops could be used to create holographic displays, the scientists said in the paper. Incoherent light refers to light whose waves lack a consistent phase or wavelength.
Traditionally, computer-generated holography (CGH) requires "coherent" light sources such as lasers, which have a uniform phase and wavelength that are easier to precisely control. This makes them ideal for generating clear, high-resolution holograms.
However, lasers are expensive and potentially harmful to the eye, the researchers said, making them impractical in everyday scenarios. They can also introduce visual artifacts like "speckle noise" — random, grainy interference in images that can reduce visual quality and clarity.
"Our method does not use lasers, thereby eliminating speckle noise," lead study author Ryoichi Horisaki, associate professor at the University of Tokyo's Graduate School of Information Science and Technology, told Live Science.
Incoherent light is less suitable for holography because its waves are not synchronized, making it difficult to control. However, using a cascade of holograms, the team structured the otherwise chaotic light waves from the iPhone to form a precise 3D image.
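The underlying difficulty is that incoherent waves do not maintain a fixed phase relationship, so their intensities add rather than their amplitudes, and the interference fringes that conventional holography relies on wash out. The toy comparison below makes that distinction concrete:

```python
# Toy comparison of coherent vs. incoherent addition of two light waves.
# With coherent light the complex amplitudes add (phase matters, fringes appear);
# with incoherent light only the intensities add and the fringes vanish.
import numpy as np

x = np.linspace(0, 100e-6, 1000)         # position across a screen, meters
k = 2 * np.pi / 530e-9                   # wavenumber for green light (assumed)

wave_a = np.exp(1j * k * x * 0.02)       # two waves arriving at slightly
wave_b = np.exp(1j * k * x * 0.03)       # different angles (illustrative)

coherent = np.abs(wave_a + wave_b) ** 2                    # fringes between ~0 and 4
incoherent = np.abs(wave_a) ** 2 + np.abs(wave_b) ** 2     # flat, constant value 2

print(coherent.min(), coherent.max())      # roughly 0.0 and 4.0 (interference)
print(incoherent.min(), incoherent.max())  # 2.0 and 2.0 (no interference)
```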
They said this approach could present "a more cost-effective and less complex method" for developing holographic displays using widely available devices. It could also be used to create interfaces for augmented and virtual reality (AR/VR) devices in the future.
"Our method has advantages for applications in compact, cost-effective, and safe near-eye displays, including smart glasses," Horisaki said.
Owen Hughes is a freelance writer and editor specializing in data and digital technologies. Previously a senior editor at ZDNET, Owen has been writing about tech for more than a decade, during which time he has covered everything from AI, cybersecurity and supercomputers to programming languages and public sector IT. Owen is particularly interested in the intersection of technology, life and work – in his previous roles at ZDNET and TechRepublic, he wrote extensively about business leadership, digital transformation and the evolving dynamics of remote work.