Posted on 05/04/2019 12:20:44 AM PDT by LibWhacker
Developed in China, the lidar-based system can cut through city smog to resolve human-sized features at vast distances.
Long-distance photography on Earth is a tricky challenge. Capturing enough light from a subject at great distances is not easy. And even then, the atmosphere introduces distortions that can ruin the image; so does pollution, which is a particular problem in cities. That makes it hard to get any kind of image beyond a distance of a few kilometers or so (assuming the camera is mounted high enough off the ground to cope with Earth's curvature).
But in recent years, researchers have begun to exploit sensitive photodetectors to do much better. These detectors are so sensitive they can pick up single photons and use them to piece together images of subjects up to 10 kilometers (six miles) away.
Nevertheless, physicists would love to improve even more. And today, Zheng-Ping Li and colleagues from the University of Science and Technology of China in Shanghai show how to photograph subjects up to 45 km (28 miles) away in a smog-plagued urban environment. Their technique uses single-photon detectors combined with a unique computational imaging algorithm that achieves super-high-resolution images by knitting together the sparsest of data points.
The new technique is relatively straightforward in principle. It is based on light detection and ranging, or lidar: illuminating the subject with laser light and then creating an image from the reflected light.
The big advantage of this kind of active imaging is that the photons reflected from the subject return to the detector within a specific time window that depends on the distance. So any photons that arrive outside this window can be ignored.
This gating dramatically reduces the noise created by unwanted photons from elsewhere in the environment. And it allows lidar systems to be highly sensitive and distance specific.
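In numbers, the gate is just a time-of-flight window: a photon reflected from a subject at distance d returns after t = 2d/c, so anything arriving well outside that window can be discarded. A minimal sketch of this kind of range gating (the gate width here is illustrative, not a figure from the paper):

```python
# Time-of-flight range gating: keep only photons whose arrival time
# matches the round trip to the target distance.
C = 299_792_458.0  # speed of light, m/s

def round_trip_time(distance_m):
    """Time for a laser pulse to reach the target and return."""
    return 2.0 * distance_m / C

def in_gate(arrival_time_s, target_distance_m, gate_width_s):
    """Accept a photon only if it arrives within the expected window."""
    expected = round_trip_time(target_distance_m)
    return abs(arrival_time_s - expected) <= gate_width_s / 2.0

# A target 45 km away returns photons after about 300 microseconds.
print(f"{round_trip_time(45_000) * 1e6:.1f} us")  # ~300.2 us

# A stray photon scattered off smog 1 km out falls outside a 1-us gate,
# while a photon from the target itself is accepted.
print(in_gate(round_trip_time(1_000), 45_000, 1e-6))   # False
print(in_gate(round_trip_time(45_000), 45_000, 1e-6))  # True
```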
To make the new system even better in urban environments, Zheng-Ping and co use an infrared laser with a wavelength of 1550 nanometers, a repetition rate of 100 kilohertz, and a modest power of 120 milliwatts. This wavelength makes the system eye-safe and allows the team to filter out solar photons that would otherwise overwhelm the detector.
The researchers send and receive these photons through the same optical apparatus: an ordinary astronomical telescope with an aperture of 280 mm. The reflected photons are then detected by a commercial single-photon detector. To create an image, the researchers scan the field of view using a piezo-controlled mirror that can tilt up, down, and side to side.
In this way, they can create two-dimensional images. But by changing the gating timings, they can pick up photons reflected from different distances to build a 3D image.
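As a rough sketch of how gate timings map onto depth (the time binning and helper below are illustrative, not the team's actual processing): each pixel's photon arrival times can be histogrammed, and the dominant time bin gives that pixel's distance via d = c·t/2.

```python
# Sketch: turn one pixel's photon arrival times into a depth estimate.
from collections import Counter

C = 299_792_458.0  # speed of light, m/s
BIN = 1e-9         # 1-ns timing bins, roughly 15 cm of depth each

def depth_from_times(arrival_times_s):
    """Pick the most-populated time bin and convert it to distance."""
    if not arrival_times_s:
        return None
    bins = Counter(round(t / BIN) for t in arrival_times_s)
    best_bin, _ = bins.most_common(1)[0]
    return C * (best_bin * BIN) / 2.0

# Two photons from a target ~45,000 m away plus one noise photon
# scattered at ~1,000 m: the majority bin wins.
times = [2 * 45_000 / C, 2 * 45_000 / C, 2 * 1_000 / C]
print(depth_from_times(times))  # ~45,000 m
```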
The final advance the team has made is to develop an algorithm that knits an image together using the single-photon data. This kind of computational imaging has advanced in leaps and bounds in recent years, allowing researchers to create images from relatively small sets of data.
The results speak for themselves. The team set up the new camera on the 20th floor of a building on Chongming Island in Shanghai and pointed it at the Pudong Civil Aviation Building across the river, some 45 km away.
[Image: single-pixel resolution imaging]
Conventional images taken through the telescope show nothing other than noise. But the new technique produces images with a spatial resolution of about 60 cm, which resolves building windows. "This result demonstrates the superior capability of the near-infrared single-photon LiDAR system to resolve targets through smog," say the team.
That's also significantly better than the conventional diffraction limit of 1 meter at 45 km, and certainly better than other recently developed algorithms. The image here shows the potential of the technique in images taken in daylight from a distance of 21 km. "Our results open a new venue for high-resolution, fast, low-power 3D optical imaging over ultralong ranges," say Zheng-Ping and co.
That's interesting work with a wide range of applications. The team mentions remote sensing, airborne surveillance, and target recognition and identification. Indeed, the entire device is about the size of a large shoebox, and so is relatively portable.
And Zheng-Ping and co say it can be significantly improved. "Our system is feasible for imaging at a few hundreds of kilometers by refining the setup, and thus represents a significant milestone towards rapid, low-power, and high-resolution LiDAR over extra-long ranges," they say.
So keep smiling; they may be watching.
Very interesting, but one question....
At 45 kilometers, how do you know where to point the telescope to find your subject?
Well, drones and Alexa weren't bad enough; now this!
Privacy is pretty much dead.
If your target has a permanent address, it shouldn't be too big a problem.
For instance, if the SEALs had had this available when they were after Osama bin Laden, they could have used it to confirm his presence in the compound once they suspected he was there.
They could have set it up on a mountain that looked down on the compound.
That mirror ball cap sure does the trick.
Individual photons are pretty small.
So that is what is going to be in the next Huawei P40...
Photons are fascinating little suckers. They have no mass, no time, and can last for a really, really long time. There's so much more we can/might learn about those little buggers.
Can it resolve an image of my middle finger being raised at them?
Yeah, It’s a conundrum...
Some people look much better from 45km away...lol
i.e. Jerry Nadler
How can you hear the person yell “say cheese!”?
Simple. You just record everything and sort it out later. /NSA
This sounds as though it is using the terahertz technology used in spy cameras. They function in the gap between microwave and infrared.
Only if your middle finger is one meter long.
bkmk
"At 45 kilometers, how do you know where to point the telescope to find your subject?"
Even more interesting is that the "curve of the earth" would prohibit any straight-line-of-sight shot from 45 km (about 28 miles) away of anything but a building roughly half the height of the Empire State Building.
If you were trying to shoot something 45 km (28 miles) away, your object would be about 520 feet below the horizon line (somewhat less if you held the camera at an eye height of five feet), or "under the curve" ... completely outside any direct line of sight, unless you were shooting a building taller than 520 feet...
and then you would only see the very top of the building at 28 miles away: http://earthcurvature.com/
https://dizzib.github.io/earth/curve-calc/?d0=28&h0=6&unit=imperial
Earth Curvature Calculator
by NyttNorge.com
Accurately calculate the curvature you are supposed to see on the ball Earth.
Distance      Curvature
1 mile          0.00013 miles =      0.67 feet
10 miles        0.01263 miles =     66.69 feet
50 miles        0.31575 miles =   1667.17 feet
100 miles       1.26296 miles =   6668.41 feet
200 miles       5.05102 miles =  26669.37 feet
500 miles       31.5336 miles = 166497.53 feet
1000 miles      125.632 miles = 663337.65 feet
Explanation: The Earth's radius (r) is 6371 km or 3959 miles, based on numbers from Wikipedia, which gives a circumference (c) of c = 2 * π * r = 40 030 km.
We wish to find the height (h), which is the drop due to curvature over the distance (d).
Using the circumference, we find that 1 kilometer corresponds to an angle of 360° / 40 030 km = 0.009°.
The angle (a) is then a = 0.009° * distance (d).
The derived formula h = r * (1 - cos a) is accurate for any distance (d).
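That formula reproduces the table above. A quick sanity check, using the stated radius of 3959 miles (the small differences from the table come from rounding in the radius used):

```python
import math

R_MILES = 3959.0  # Earth's mean radius, as stated above

def curvature_drop_feet(distance_miles):
    """Drop below the tangent line: h = r * (1 - cos a), with a = d / r radians."""
    a = distance_miles / R_MILES
    return R_MILES * (1.0 - math.cos(a)) * 5280.0  # convert miles to feet

print(round(curvature_drop_feet(10), 2))  # ~66.7 ft, matching the table
print(round(curvature_drop_feet(28)))     # ~523 ft, the "under the curve" figure at 28 miles
```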
Anonymity is just about gone as well.
Facial recognition software will be perfected soon.
Pair that with this camera and your every move will be documented, from birth to death.