Want to catch a photon? Start by silencing the sun

Quantum breakthrough uses light’s quirky properties to boost 3D imaging, paving the way for enhanced performance in self-driving cars, medical imaging and deep-space communications

Stevens Institute of Technology

Even with a mesh screen covering an object (top), the Stevens quantum 3D imaging technique generates images 40,000 times clearer (middle) than current technologies (bottom). Credit: Stevens Institute of Technology


Researchers at Stevens Institute of Technology have created a 3D imaging system that uses light’s quantum properties to produce images 40,000 times crisper than current technologies, paving the way for never-before-seen LIDAR sensing and detection in self-driving cars, satellite mapping systems, deep-space communications and medical imaging of the human retina.

The work, led by Yuping Huang, director of the Center for Quantum Science and Engineering at Stevens, addresses a decades-old problem with LIDAR, which fires lasers at distant targets, then detects the reflected light. While light detectors used in these systems are sensitive enough to create detailed images from just a few photons – minuscule particles of light that can be encoded with information – it’s tough to differentiate reflected fragments of laser light from brighter background light such as sunbeams.
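
For readers new to LIDAR, the ranging itself is plain time-of-flight arithmetic: the system clocks how long a pulse takes to come back and converts that time to distance. A minimal Python sketch of that textbook relation (not the Stevens hardware, just the generic formula range = c × time / 2) follows:

    # Generic time-of-flight LIDAR arithmetic (textbook relation, not the Stevens setup).
    C = 299_792_458.0  # speed of light in m/s

    def range_from_round_trip(t_seconds: float) -> float:
        """Distance to the target given the measured round-trip time of a pulse."""
        return C * t_seconds / 2.0

    def round_trip_from_range(d_meters: float) -> float:
        """Round-trip time expected for a target at distance d."""
        return 2.0 * d_meters / C

    # A 1 ns timing error corresponds to roughly 15 cm of range error.
    print(range_from_round_trip(1e-9))      # ~0.15 m
    # The 30 km range mentioned later in the article implies a ~200 microsecond round trip.
    print(round_trip_from_range(30_000.0))  # ~2.0e-4 s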

“The more sensitive our sensors get, the more sensitive they become to background noise,” said Huang, whose work appears in the Feb. 17 advanced online issue of Nature Communications. “That’s the problem we’re now trying to solve.”

The technology is the first real-world demonstration of single-photon noise reduction using a method called Quantum Parametric Mode Sorting, or QPMS, which was first proposed by Huang and his team in a 2017 Nature paper. Unlike most noise-filtering tools, which rely on software-based post-processing to clean up noisy images, QPMS checks light’s quantum signatures through exotic nonlinear optics to create an exponentially cleaner image at the level of the sensor itself.

Detecting a specific information-bearing photon amid the roar of background noise is like trying to pluck a single snowflake from a blizzard — but that’s exactly what Huang’s team has managed to do. Huang and colleagues describe a method for imprinting specific quantum properties onto an outgoing pulse of laser light, and then filtering incoming light so that only photons with matching quantum properties are registered by the sensor.

The result: an imaging system that is incredibly sensitive to photons returning from its target, but that ignores virtually all unwanted noisy photons. The team’s approach yields sharp 3D images even when every signal-carrying photon is drowned out by 34 times as many noisy photons.
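
To get a feel for those numbers, here is a back-of-envelope sketch. It is my own toy arithmetic, not the paper’s analysis: the signal-transmission and noise-rejection values below are assumptions chosen purely to show how rejecting noise at the detector lifts the signal-to-noise ratio.

    # Toy signal-to-noise arithmetic (illustrative only; the definitions used in the
    # actual Nature Communications paper may differ).
    signal_photons = 1.0
    noise_photons = 34.0            # "34 times as many noisy photons" per signal photon

    raw_snr = signal_photons / noise_photons          # ~0.03 with no filtering

    noise_rejection = 40_000.0      # hypothetical rejection factor, reading the quoted
                                    # "40,000 times" figure as pure noise suppression
    signal_transmission = 0.5       # hypothetical: assume the filter also costs some signal

    filtered_snr = (signal_photons * signal_transmission) / (noise_photons / noise_rejection)
    print(f"raw SNR      ~ {raw_snr:.3f}")       # ~0.029
    print(f"filtered SNR ~ {filtered_snr:.0f}")  # ~588 under these assumed values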

“By cleaning up initial photon detection, we’re pushing the limits of accurate 3D imaging in a noisy environment,” said Patrick Rehain, a Stevens doctoral candidate and the study’s lead author. “We’ve shown that we can reduce the amount of noise about 40,000 times better than the top current imaging technologies.”

That hardware-based approach could facilitate the use of LIDAR in noisy settings where computationally intensive post-processing isn’t possible. The technology could also be combined with software-based noise reduction to yield even better results. “We aren’t trying to compete with computational approaches — we’re giving them new platforms to work in,” Rehain said.

In practical terms, QPMS noise reduction could allow LIDAR to be used to generate accurate, detailed 3D images at ranges of up to 30 kilometers. It could also be used for deep-space communication, where the sun’s harsh glare would ordinarily drown out distant laser pulses.

Perhaps most excitingly, the technology could also give researchers a closer look at the most sensitive parts of the human body. By enabling virtually noise-free single-photon imaging, the Stevens imaging system will help researchers create crisp, highly detailed images of the human retina using almost invisibly faint laser beams that won’t damage the eye’s sensitive tissues.

“The single-photon imaging field is booming,” said Huang. “But it’s been a long time since we’ve seen such a big step forward in noise reduction, and the benefits it could impart to so many technologies.”

###

From EurekAlert!

44 thoughts on “Want to catch a photon? Start by silencing the sun”

    • Perhaps for nearby objects where it’s practical to illuminate with QPMS light for some kind of active imaging?

      This has me wondering about active imaging for HEL (High Energy Laser) target illumination and beam control. Since the PR talks so much about the single-photon level, I’m guessing there is some catch. Of course the writer is pretty imprecise with the performance terms. Reducing noise isn’t exactly “crisper”, but whatever. Saying “raising the SNR” would, I think, be understood by most readers, and might keep them from thinking this somehow creates super resolution.

      • Exactly, as usual the idiots trying to talk down to us fail to actually communicate any useful, intelligible information.

        So far all we have is “QPMS noise reduction” and “exponentially cleaner image”, which is scientifically meaningless BS. Not even a reference to the paper!

        You are correct, this is SNR not better resolution.

        • Not even a reference to the paper

          Come on, Greg – there is a link to the EurekAlert article, which has a link to the paper. How spoon-fed do you really need to be?

          I tried to read the paper and was way out of my depth once I got past the introduction.

          Assuming the summary in the abstract is accurate, this is an amazing development and it’s going to revolutionize the remote-sensing business. And all kinds of applications will follow. Law enforcement will love it – really good facial recognition, licence-plate readers (not that I think those would be entirely good developments). Limited only by your imagination.

          • I just watched a few minutes of a financial promotion. Then I read the blurb above, and the “presentation hype” sounded almost identical. What a huge turnoff!

    • I suppose that it would, indirectly…

      If you consider improved, or at least cheaper, QA for all their toys in, say, ten years.

      • Cloud cover can already be imaged away. You simply filter for the elevation that you want from the LIDAR signal; clouds obviously are higher than ground features.

        But what happens is that your signal is weak b/c clouds typically absorb so much of your laser (depending on wavelength of course).

        What this is hinting at is that the S/N can stay high even with a much-reduced signal.
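
        A minimal sketch of the range-gating idea described in this comment, assuming we already have per-pulse (range, intensity) returns from a nadir-pointing airborne LIDAR and a rough cloud-base height; every name and number here is hypothetical:

          # Hypothetical range-gating sketch: drop returns whose implied elevation
          # is above the cloud base, keep the (weaker) ground returns.
          from typing import List, Tuple

          def gate_out_clouds(returns: List[Tuple[float, float]],
                              sensor_altitude_m: float,
                              cloud_base_m: float) -> List[Tuple[float, float]]:
              """Keep only returns whose implied elevation lies below the cloud base."""
              kept = []
              for rng, intensity in returns:
                  elevation = sensor_altitude_m - rng  # nadir-pointing sensor assumed
                  if elevation < cloud_base_m:
                      kept.append((rng, intensity))
              return kept

          # Sensor at 3000 m, cloud base near 1500 m, ground near 0 m:
          pulses = [(1400.0, 0.9), (2950.0, 0.2)]  # a cloud return, then a weak ground return
          print(gate_out_clouds(pulses, sensor_altitude_m=3000.0, cloud_base_m=1500.0))
          # -> [(2950.0, 0.2)]  (the cloud return is gated out, the weak ground return kept)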

    • Single photons can be cost-effectively processed on Zoosk…though they’ll never, ever find a photon mate there.

  1. That is stupidly cool. One of the more interesting developments I’ve read in a long time. Thanks for sharing it!

  2. Sounds fascinating.

    Think about the very poor baud rate achievable when sending signals from space probes in our solar system. The signal’s information is filtered from the noise via software, but the worse the SNR, the slower the baud rate. If the transmission could be done with light and the new method described in the article, instead of radio waves, I could imagine getting info from space probes faster and from further away than ever before.

    Regarding self driving cars, I still manage to drive my car with hands and feet.
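
    A rough illustration of the SNR-versus-data-rate point above, using the generic Shannon-Hartley bound C = B · log2(1 + SNR); the bandwidth and SNR values are invented purely for illustration and describe no particular deep-space link:

      import math

      def shannon_capacity_bps(bandwidth_hz: float, snr_linear: float) -> float:
          """Shannon-Hartley channel capacity: C = B * log2(1 + SNR)."""
          return bandwidth_hz * math.log2(1.0 + snr_linear)

      bandwidth = 1e6  # 1 MHz of usable bandwidth (illustrative)
      for snr in (0.01, 1.0, 100.0):  # poor, break-even and strong signal-to-noise ratios
          kbps = shannon_capacity_bps(bandwidth, snr) / 1e3
          print(f"SNR {snr:8.2f} -> capacity ~ {kbps:8.1f} kbit/s")
      # Raising the SNR from 0.01 to 1.0 raises capacity from ~14 kbit/s to ~1000 kbit/s.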

    • I doubt that image quality is the biggest problem.

      Human drivers are remarkably safe, something like 1.2 fatalities per hundred million miles (link). As far as I can tell, self driving cars are nowhere near that safe yet and maybe never will be.

          • Peripheral vision isn’t only a wider field of view, but also the way we perceive what is viewed.
            Peripheral vision is a vital component of the visual field. Other than allowing us to see things on our sides, it gives a sense of visual perception in crowded areas such as in traffic. Peripheral vision can help people view objects from the corner of their eye, including objects and movement outside of the gaze of the central vision. Compared to central vision, peripheral vision plays a more effective role in viewing objects in the dark, due to the large number of rods in the peripheral retina.
            Source

          • They can be equipped with multiple eyes or wide-angle lenses, but the processing is all that matters, and the best they can do is beep, which is problematic on roads with poor lane striping.
            That’s why your neck swivels.

      • Commie: 1.2 fatalities per hundred million miles of driving is about 1 fatality over the distance to the sun! Cool!

        The eye seems to have a ‘filtering’ ability as well, in that it sees images clearly on bright sunny days with competing photons bouncing off everything around it. I’m not very clear on how we can read a book so well, isolating the letters so “crisply”, with all the photons assaulting the eye from all around us. I’m sure there are readers here who can enlighten me on this.

        Does this mean we could ‘coat’ a satellite so that it sends out only identified photons that could be gathered into an image, or see it through a pair of special binoculars?
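
        A quick check of the distance-to-the-sun arithmetic above, taking the mean Earth-Sun distance as roughly 93 million miles:

          # Expected fatalities over one Earth-Sun distance at the quoted rate of
          # about 1.2 fatalities per hundred million vehicle-miles.
          fatalities_per_mile = 1.2 / 100e6
          earth_sun_miles = 93e6  # approximate mean Earth-Sun distance
          print(fatalities_per_mile * earth_sun_miles)  # ~1.1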

      • “As far as I can tell, self driving cars are nowhere near that safe yet”

        Alphabet (Google)’s might be acceptably safe. Their approach to autonomous vehicles has been far more conservative than most other players. The trouble is that they achieve their safety by driving like a little old lady. They drive slowly and only take routes they are familiar with. They don’t proceed if anything seems the slightest bit off. I’m told that other drivers, especially those behind their vehicles, find this behavior extremely annoying. Pretty much exactly like following a school bus.

      • Interesting, and when you eliminate those driving under the influence, that safety record must improve dramatically.

  3. Well, the sun won’t be silenced; it is just having a bit of a rest, with another 24 spotless days. No sign of SC25, despite one or two spots in December and January. It’s just wait-and-see time.

  4. Another bit of technology trying to emulate a feat life achieved and refined over eons.

    “detection in self-driving cars”

    “The result: an imaging system that is incredibly sensitive to photons returning from its target, but that ignores virtually all unwanted noisy photons.”

    Oh?
    What happens when the road is filled with self driving systems spraying the area with ‘QPMS quantum signature’ enhanced light?

    It sounds like their system might work well when there is only one system lighting up targets, provided the time it takes to process images is quick.
    But when everything is shining ‘QPMS quantum signature’ enhanced and reflected light around the area, they will end up with the same problem of resolving light-washed targets, and possibly also with light arriving directly from other ‘QPMS quantum signature’ sources.

  5. Perhaps we can use this technology to see deep into the brains of alarmists so we can determine what’s wrong.

  6. Hmmm… or hide a signal inside a beam of light, or create different ‘VLANs’ in the same beam using QPMS for the tagging.

  7. This press release is strong on hype, short on details. The original article is https://www.nature.com/articles/s41467-020-14591-8.

    At first glance, they store some photons from a laser pulse (all photons in the pulse are identical) and detect only photons that correspond to the stored ones. How exactly that’s done, I don’t understand yet.

    • That was my question: how do they put a quantum imprint on a photon that survives the round trip?

  8. First, run this horse by us again:
    “QPMS checks light’s quantum signatures through exotic nonlinear optics” – Kerr cells, or what?
    Second, they have moved the signal processing out of the traditional computer into an actual physical system, so quantum computing. Analog, with a major difference.
    Third, we already know living systems are optically active; see Gurwitsch’s mitogenetic radiation: extremely weak and noisy, but UV photons. Shades of the great Pasteur.
    So the investigation of actual living processes, cell replication, optically, is opened up! Cancer research anyone?

    And what do the debased, culturally decadent, willing lickspittles of high finance do? They use it to push police-state facial recognition or autonomous autos!

Comments are closed.