
New technology from MIT may enable cheap, color, holographic video displays

A butterfly imaged on MIT's new holographic display
Hologram of a mouse as viewed from different angles (Photo: Georg-Johann Ley via Wikimedia)
Pieces of a broken hologram showing different views of the same object (Photo: Epzcaw via Wikimedia)
A raw hologram of a toy van. The image is stored in the interference pattern caught on the film shown (Photo: Epzcaw via Wikimedia)

Researchers at MIT’s Media Lab have developed a new form of holographic projector that may enable the introduction of practical color 3D holographic video displays as well as higher-resolution 2D displays with lower power consumption. The new projector is built using principles of guided wave optics to construct the spatial light modulator (SLM) that is the heart of digital holography. The MIT holographic projector, which contains an SLM costing US$10 to fabricate, provides 3D images at 30 frames per second (fps) with a resolution similar to that of a standard-definition TV.

"A New Microscopic Principle" hardly seems the harbinger of what may well become the most fundamental change in data communication since the introduction of the television, but that is the name that Dennis Gabor gave to his 1948 research paper announcing his invention of holography. While used immediately in electron microscopy, the potential for optical holograms languished until the 1960 invention of the laser.

Recording a Hologram: Light generated by a laser is split into two paths. One path is directed onto the object being holographed, and the other onto a sheet of film. The interference pattern between the object beam and the reference beam is the hologram (Image: Bob Mellish via Wikipedia)

A hologram is recorded by exposing a light-sensitive sensor (for example, photographic film or a high-resolution CCD) simultaneously to a coherent beam of light (the reference beam) and the reflection of that beam from the scene being recorded (the object beam). The sensor records not an image of the scene, but the interference pattern formed at its surface between the object beam and the reference beam. This interference pattern contains all the information in the light field at the sensor.
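
To make that concrete, here is a minimal one-dimensional sketch (not the MIT group's code; the wavelength, angle, and geometry are arbitrary illustrative values) showing how summing a reference wave and an object wave and storing only the intensity still captures the object's phase:

```python
# Toy 1-D hologram recording: store the intensity of (reference + object).
import numpy as np

wavelength = 633e-9                    # assumed red HeNe laser, in metres
k = 2 * np.pi / wavelength
x = np.linspace(-1e-3, 1e-3, 4000)     # a 2 mm strip of "film"

# Reference beam: a plane wave hitting the film at a small angle.
theta = np.deg2rad(2.0)
reference = np.exp(1j * k * x * np.sin(theta))

# Object beam: the wave scattered from a single point 10 cm in front of the film.
z_obj = 0.10
r = np.sqrt(x**2 + z_obj**2)
object_beam = np.exp(1j * k * r) / np.sqrt(r)

# The film records only intensity, but the cross terms of |R + O|^2
# still encode the object wave's phase -- that is the hologram.
hologram = np.abs(reference + object_beam) ** 2
print(hologram.min(), hologram.max())  # a real-valued fringe pattern
```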

To play back a hologram, the interference pattern of the original hologram is reproduced, and a coherent beam of light (typically having the same wavelength as the original laser illumination source) is directed onto the pattern along the same direction as was the reference beam. The reconstruction beam is diffracted from the interference pattern, and thereby reproduces the 3D image information of the subject of the hologram. For us, a glowing but seemingly solid image suddenly appears floating in space.
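
Continuing the same toy model (again with assumed values, not the researchers' code), re-illuminating the stored intensity pattern with the reference beam yields a field that algebraically contains a term proportional to the original object wave, which is why the image reappears:

```python
# Toy 1-D hologram playback: hologram acts as a transmittance mask for the
# reference beam, and |R + O|^2 * R expands into a background term, a
# conjugate "twin" term, and |R|^2 * O -- the original object wave.
import numpy as np

wavelength = 633e-9
k = 2 * np.pi / wavelength
x = np.linspace(-1e-3, 1e-3, 4000)

theta = np.deg2rad(2.0)
reference = np.exp(1j * k * x * np.sin(theta))
r = np.sqrt(x**2 + 0.10**2)
object_beam = np.exp(1j * k * r) / np.sqrt(r)

hologram = np.abs(reference + object_beam) ** 2     # what the film stored

reconstructed = hologram * reference                # re-illumination
dc_term = (np.abs(reference)**2 + np.abs(object_beam)**2) * reference
twin_term = reference**2 * np.conj(object_beam)
image_term = np.abs(reference)**2 * object_beam     # proportional to the object wave

print(np.allclose(reconstructed, dc_term + twin_term + image_term))  # True
print(np.allclose(image_term, object_beam))         # True: |R|^2 = 1 here
```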

Viewing a Hologram: A reconstruction beam of laser light is directed onto the hologram at the same angle as was the reference beam during the original exposure. The reconstruction beam is diffracted by the hologram, and produces a virtual image of the object at the original location (Image: Bob Mellish via Wikipedia)

With video displays being of considerably greater value than static 3D picture frames, a dynamic substitute for photographic film has long been sought, with varying degrees of success. An active holographic display is based on a spatial light modulator (SLM), a device that changes the intensity and/or the phase of a beam of light. A simple example is an overhead projector, wherein the transparency acts as an SLM.
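
In the simplest terms, any SLM multiplies the incoming light field, point by point, by a complex transmittance. The tiny sketch below (a generic illustration with made-up random values, not tied to any real device) shows combined amplitude and phase modulation:

```python
# Toy SLM: each point attenuates (amplitude) and/or delays (phase) the light.
import numpy as np

incoming = np.ones((4, 4), dtype=complex)           # a uniform plane wave

amplitude = np.random.uniform(0.0, 1.0, (4, 4))     # 0 = opaque, 1 = fully clear
phase = np.random.uniform(0.0, 2 * np.pi, (4, 4))   # local phase delay
transmittance = amplitude * np.exp(1j * phase)

outgoing = transmittance * incoming                 # the modulated wavefront
print(np.abs(outgoing) ** 2)                        # intensity pattern seen downstream
```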

The SLMs found in previous active displays, such as this group's earlier Mark-II holographic display, have relied on liquid crystals, MEMS-based micromirror arrays, and bulk acousto-optic devices. However, these all have limitations for holographic video, including limited bandwidth, large feature sizes (which restrict how strongly they can diffract light), and various forms of noise. Multiplexing to obtain color images is also quite difficult with these SLM devices.

Holographic image of a Princess Leia stand-in shown using MIT's Mark-II holographic projector (Photo: MIT Media Lab)

To avoid these problems, the group at the MIT Media Lab has developed an anisotropic leaky-mode integrated-optics SLM on a lithium niobate chip. Lithium niobate is the primary substrate for integrated optics, with waveguides, modulators, and switches built directly into its surface.

In the new MIT SLM, a narrow optical waveguide is formed in a lithium niobate chip using ion implantation. The implanted ions raise the refractive index (make the crystal optically denser), so the narrow channel guides light along its length. This waveguide supports one optical mode that directs light along it, and a second "leaky" mode of the opposite polarization that is not guided, but instead escapes through the bottom of the crystal, which is where the holographic image will be generated.
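
A rough way to see why one polarization stays in the channel while the other escapes: a mode is confined only if its effective index exceeds the substrate index for its polarization, and because lithium niobate is birefringent, the two polarizations see different substrate indices. The numbers below are illustrative guesses, not measured values from the MIT device:

```python
# Guided vs. leaky: confinement requires n_substrate < n_eff for that polarization.
def is_guided(n_eff: float, n_substrate: float) -> bool:
    """A waveguide mode is confined when its effective index exceeds the substrate index."""
    return n_eff > n_substrate

n_eff = 2.25          # hypothetical effective index of the channel mode
n_sub_pol_a = 2.20    # hypothetical substrate index for one polarization
n_sub_pol_b = 2.29    # hypothetical substrate index for the orthogonal polarization

print(is_guided(n_eff, n_sub_pol_a))  # True  -> stays in the waveguide
print(is_guided(n_eff, n_sub_pol_b))  # False -> radiates out through the substrate
```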

Leaky-mode light modulator (Image: MIT Media Lab)

In MIT's new design, light of the proper polarization is injected into the SLM so that it is guided by the waveguide. Lithium niobate is piezoelectric, so when a radio frequency (RF) signal is applied to electrodes on the surface, it generates an acoustic wave along the crystal. When the RF is off, the light stays in the waveguide, and none escapes to form the hologram. When the RF is on, the interaction between the acoustic wave and the light in the waveguide rotates the light's polarization, coupling it into the leaky mode and out of the chip to form the holographic image. This method is immediately capable of reproducing multicolor holograms.
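
A hedged sketch of that on/off behavior, using the standard coupled-mode picture of acousto-optic mode conversion (an assumed simplification, not the group's actual device model): the fraction of guided light transferred into the leaky, orthogonally polarized mode grows as sin^2(kappa * L), with the coupling strength kappa proportional to the RF drive amplitude, so zero drive leaves everything in the guide.

```python
# Coupled-mode estimate of how much light the RF drive couples out of the guide.
import numpy as np

def converted_fraction(rf_amplitude_volts: float, interaction_length_m: float,
                       coupling_per_volt: float = 50.0) -> float:
    """Fraction of optical power coupled into the leaky mode.

    coupling_per_volt (rad per volt-metre) is a made-up illustrative constant.
    """
    kappa = coupling_per_volt * rf_amplitude_volts    # coupling strength, rad per metre
    return float(np.sin(kappa * interaction_length_m) ** 2)

print(converted_fraction(0.0, 0.01))  # RF off: 0.0 -- all light stays guided
print(converted_fraction(1.0, 0.01))  # RF on: ~0.23 of the light couples out
```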

The MIT group has demonstrated a holographic display that can show 3D images with a resolution of about 400 x 400 x 138 pixels at five frames per second, as seen in the initial picture above. They are presently building a prototype with 1,250 channels (leaky-mode waveguides), with which they expect to achieve a bandwidth of 125 Gigapixels/sec, generating an image of a cube about 1,500 pixels on a side.
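
As a back-of-envelope check on those figures (treating each voxel as one "pixel" of bandwidth, an assumption the article implies but does not state):

```python
# Rough arithmetic on the quoted display figures.
demo_voxels = 400 * 400 * 138            # ~22.1 million voxels per frame
demo_rate = demo_voxels * 5              # at 5 fps -> ~110 million voxels/s
print(f"demonstrated throughput: {demo_rate / 1e6:.0f} Mvoxels/s")

proto_voxels = 1500 ** 3                 # ~3.4 billion voxels per frame
proto_bandwidth = 125e9                  # the quoted 125 Gigapixels/s
print(f"implied prototype frame rate: {proto_bandwidth / proto_voxels:.0f} fps")
```

At those numbers the planned prototype would run at roughly 37 frames per second, broadly consistent with the 30 fps figure quoted at the top of the article.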

"What's most exciting about [the new chip] is that it’s a waveguide-based platform, which is a major departure from every other type of spatial light modulator used for holographic video right now," says Daniel Smalley, lead author on the research paper published in Nature. "One of the big advantages here is that you get to use all the tools and techniques of integrated optics. Any problem we’re going to meet now in holographic video displays, we can feel confidence that there’s a suite of tools to attack it, relatively simply.”

Don't expect holographic televisions tomorrow, but overall the prospect is looking up.

Source: MIT Media Lab

4 comments
kelvint63
This is something I've been thinking about for years; can two laser beams merging at a 90-degree angle be used to create a hologram? If they can use the beam to draw an image on a backdrop or even smoke, why can't they use another laser beam for the backdrop? Computers have gotten fast enough and precise enough to have the two beams constantly intersect as they're both moved around to draw an image.
kalqlate
@kelvint63 - As far as I know, photons do not interact with photons.
Franc
Photons do not interact with photons. However, this is not how the image is created. The image is created on the photographic plate but the image appears to hang in space - it is a virtual image.
Joel Detrow
Franc, while it's true that photons don't typically interact with each other (because a photon is the particle through which other particles interact with light), light waves do interact with each other - the double-slit experiment proves this. Furthermore, photons DO interact with our eyes, and our brain interprets this input to create our vision. Even if individual photons weren't interacting, they could easily produce the illusion of a third dimension if controlled the right way.