Free Republic

Shoot a picture first, focus later
New Scientist ^ | 11/16/05 | Celeste Biever

Posted on 11/17/2005 11:50:52 AM PST by LibWhacker

BLURRY snaps could be a thing of the past with the development of a digital camera that refocuses photos after they have been taken.

The camera could be useful for action shots taken by sports photographers or for CCTV surveillance cameras, which often produce fuzzy shots due to poor lighting.

In an ordinary digital camera, a sensor behind the lens records the light level that hits each pixel on its surface. If the light rays reaching the sensor are not in focus, the image will appear blurry.

Now, Pat Hanrahan and his team at Stanford University have figured out how to adjust the light rays after they have reached the camera. They inserted a sheet of 90,000 lenses, each just 125 micrometres across, between the camera's main lens and the image sensor. The angle of the light rays that strike each microlens is recorded, as well as the amount of light arriving along each ray.

Software can then be used to adjust these values for each microlens to reconstruct what the image would have looked like if it had been properly focused. That also means any part of the image can be refocused - not just the main subject.
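
The article doesn't spell out how that reconstruction is done, but the usual approach to refocusing light-field data is a "shift and add" over the angular samples: each microlens view is shifted in proportion to its offset from the aperture centre and the results are averaged. The Python sketch below is only a minimal illustration of that idea under assumed array layout and parameter names, not the Stanford group's actual software.

# Minimal shift-and-add refocusing sketch (illustrative only, not the
# Stanford team's code). Assumes the raw capture has been unpacked into a
# 4D light field L[u, v, y, x]: one small sub-aperture image per (u, v) angle.
import numpy as np

def refocus(lightfield, slope):
    """Synthetically refocus a 4D light field.

    lightfield : array of shape (U, V, H, W)
    slope      : pixels of shift per unit of aperture offset; 0 reproduces
                 the as-shot focus, other values move the virtual focal plane.
    """
    U, V, H, W = lightfield.shape
    out = np.zeros((H, W))
    for u in range(U):
        for v in range(V):
            # Shift each angular sample in proportion to its offset from the
            # aperture centre, then accumulate (edge wraparound ignored here).
            dy = int(round(slope * (u - U / 2)))
            dx = int(round(slope * (v - V / 2)))
            out += np.roll(lightfield[u, v], (dy, dx), axis=(0, 1))
    return out / (U * V)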

Tracing the rays like this removes the conventional trade-off between the aperture size, which controls the amount of light that the camera takes in, and the depth of field. If light is low, a larger aperture will let enough light into the camera to form a clear image, but the laws of optics mean that a narrower slice of the world in front of the camera will appear in focus.
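
To put rough numbers on that trade-off, the standard thin-lens approximation gives a depth of field of about 2·N·c·s²/f², where N is the f-number, c the acceptable circle of confusion, s the subject distance and f the focal length. The figures below are illustrative assumptions, not from the article.

# Rough depth-of-field comparison (illustrative numbers, not from the article).
# The approximation DOF ~ 2*N*c*s^2/f^2 holds when the subject is much closer
# than the hyperfocal distance.
f = 0.050      # focal length: 50 mm, in metres
c = 0.00003    # circle of confusion: 0.03 mm
s = 3.0        # subject distance: 3 m

for N in (2.0, 8.0):   # f/2 (large aperture) vs f/8 (small aperture)
    dof = 2 * N * c * s ** 2 / f ** 2
    print(f"f/{N:g}: depth of field is roughly {dof:.2f} m")
# f/2 admits 16x as much light as f/8 but keeps a far thinner slice in focus;
# the light-field camera sidesteps the choice by refocusing after the fact.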

Hanrahan's system would be particularly useful for surveillance cameras, which must work at night but also need to have objects in focus at different distances from the camera.


TOPICS: News/Current Events; Technical
KEYWORDS: blurry; camera; cameras; cctv; digital; focus; images; later; lenses; light; microlens; picture; rays; refocus; sheet; shoot; surveillance
Very important development for homeland security, surveillance, face recognition, etc. (you name it), wow!
1 posted on 11/17/2005 11:50:55 AM PST by LibWhacker

To: LibWhacker

Amazing concept, although it sounds like it will be extremely processor-intensive to make such calculations on data from so many sensors in real time.


2 posted on 11/17/2005 11:53:50 AM PST by z3n

To: LibWhacker

Too bad the Hubble Space Telescope's camera didn't have one of these - could have saved a few billion.


3 posted on 11/17/2005 11:54:19 AM PST by governsleastgovernsbest (Watching the Today Show since 2002 so you don't have to.)

To: LibWhacker

I'm in a circle of confusion over this.


4 posted on 11/17/2005 11:55:17 AM PST by SpaceBar

To: SpaceBar

That's okay; I am, too! I hope all the physicists and photographers here on FR chime in soon and clear up some things for us. For instance, I'd like to know what kind of quality we can expect out of this sheet (plastic sheet?) of 90,000 microlenses. You don't want to shove a sheet of cheap plastic in front of a $10,000 lens!


5 posted on 11/17/2005 12:02:23 PM PST by LibWhacker

To: LibWhacker

In all seriousness though, haven't insects been doing something like this for a while?


6 posted on 11/17/2005 12:08:11 PM PST by SpaceBar

To: z3n
Amazing concept, although it sounds like it will be extremely processor-intensive to make such calculations on data from so many sensors in real time.

If I'm reading it right, it doesn't *have* to be processed in "real time". Instead, the system just takes a "snapshot" (no pun intended) of the light intensities and angles at the moment the shutter is triggered, and then this records all the information necessary to produce images at any *later* time via processing of that dataset.
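
A hypothetical sketch of that capture-now, process-later split (file names and array sizes below are made up; the refocusing itself could be done with a routine like the one sketched under the article):

# All the camera must do in real time is dump the raw 4D measurements.
import numpy as np

raw = np.random.rand(16, 16, 480, 640).astype(np.float32)  # stand-in for one exposure
np.save("exposure_0001.npy", raw)

# ... hours later, on a desktop machine ...
lightfield = np.load("exposure_0001.npy")
# Refocus at any depth, as many times as desired, without touching the camera.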

7 posted on 11/17/2005 12:08:45 PM PST by Ichneumon

To: LibWhacker

"I'd like to know what kind of quality we can expect out of this sheet (plastic sheet?) of 90,000 microlenses? "

Microlenses are not new. Most CCD devices contain them. DLP technology used in some video projectors today contains millions of tiny mirrors on pivots and tilts them as needed to reflect the correct amounts of red, blue, and green light. "They" can do some pretty amazing and tiny things these days! Also, $10K won't even touch a broadcast-quality HDTV lens. Heck, it can cost that to fix one!


8 posted on 11/17/2005 12:10:08 PM PST by bk1000 (A clear conscience is a sure sign of a poor memory)

To: SpaceBar
In all seriousness though, haven't insects been doing something like this for a while?

What? Designing and building digital cameras? (Not at all serious.)

9 posted on 11/17/2005 12:10:55 PM PST by OSHA (Liberalism - Is it real or is it Scrappleface?)

To: governsleastgovernsbest
Too bad the Hubble Space Telescope's camera didn't have one of these - could have saved a few billion.

It did, but the in-focus pics are still sharper. Hubble was corrected with software prior to the repair. If you know the exact physical characteristics of a lens, you can write software to correct its imperfections.
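
As a toy illustration of that point (not what was actually done for Hubble), a known blur can be partially undone with a Wiener filter once the optics' point spread function has been characterised; the function name and noise figure below are assumptions.

# Minimal Wiener-deconvolution sketch: if the point spread function of the
# optics is known exactly, much of the blur can be removed in software.
import numpy as np

def wiener_deconvolve(blurred, psf, noise_to_signal=1e-3):
    """Deconvolve a blurred image given a known point spread function."""
    H = np.fft.fft2(psf, s=blurred.shape)   # transfer function of the blur
    G = np.fft.fft2(blurred)
    # Invert the blur where it is strong, damp it where noise would dominate.
    F_hat = G * np.conj(H) / (np.abs(H) ** 2 + noise_to_signal)
    return np.real(np.fft.ifft2(F_hat))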

10 posted on 11/17/2005 12:12:13 PM PST by js1138 (Great is the power of steady misrepresentation.)

To: SpaceBar

Absolutely. The common housefly has an eye like that. Now if the fly's tiny brain is somehow hardwired to do what these Stanford researchers have done with their software, then I am very impressed, and it's a hopeful sign for the future of this technology.


11 posted on 11/17/2005 12:14:15 PM PST by LibWhacker

To: SpaceBar

Shutter bugs?


12 posted on 11/17/2005 12:19:51 PM PST by Eagle Eye (There ought to be a law against excess legislation.)

To: js1138

You'll probably tell me to get my tinfoil hat on, but I've always thought that the Hubble was intentionally flawed when it was launched. It was just too convenient that NASA was able to so easily come up with a fix that just happened to require a spacewalk, and this came at a time when NASA was beginning to be seen as irrelevant and unnecessary. Suddenly, NASA was able to prove that they could repair a massive telescope in space! Too much coincidence for me.


13 posted on 11/17/2005 12:24:34 PM PST by RightFighter

To: LibWhacker

So the next time you kill a fly, you are quite possibly wiping out a tiny supercomputer optimized to convolve a Gabor transform kernel with the cached hits of an orthogonal range-search traversal of a photon map. Something to think about.


14 posted on 11/17/2005 12:25:45 PM PST by SpaceBar

To: SpaceBar

Worse, news of this put my sister into a coma.


15 posted on 11/17/2005 12:34:52 PM PST by coloradan (Hence, etc.)

To: js1138

True, but you can never recover signal lost in noise.


16 posted on 11/17/2005 12:36:08 PM PST by coloradan (Hence, etc.)

To: z3n
Amazing concept, although it sounds like it will be extremely processor-intensive to make such calculations on data from so many sensors in real time.

They'll have a dedicated image-processing IC to handle these calculations in no time. It will probably be nearly instantaneous.

17 posted on 11/17/2005 12:37:58 PM PST by TChris ("Unless you act, you're going to lose your world." - Mark Steyn)

To: coloradan

Not quite true. You can recover a signal that is below the noise level if you know the characteristics of the signal. GPS receivers do this.
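
A toy version of that idea, with made-up numbers: a signal well below the noise floor can be recovered by correlating against a known spreading code, which is essentially what a GPS receiver does when it despreads.

# Despreading a signal buried about 26 dB below the noise by correlating
# against a known pseudorandom code (illustrative only).
import numpy as np

rng = np.random.default_rng(0)
code = rng.choice([-1.0, 1.0], size=100_000)              # known spreading code
received = 0.05 * code + rng.normal(0.0, 1.0, code.size)  # amplitude 0.05, noise sigma 1
estimate = received @ code / code.size                    # correlate with the known code
print(f"recovered amplitude is about {estimate:.3f} (true value 0.05)")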


18 posted on 11/17/2005 12:38:18 PM PST by js1138 (Great is the power of steady misrepresentation.)

To: LibWhacker
Software can then be used to adjust these values

Hopefully the software can decide not to focus, if, say, Hillary were to be in the background.

19 posted on 11/17/2005 12:39:15 PM PST by aimhigh

To: js1138
That's true, but you have the advantage of knowing exactly what the GPS signal looks like. The only reason to take astronomical research photos is because you don't know what's out there, which is what you're trying to find out. If a dim star's light is spread out over many pixels due to aberrations, giving a signal level below each pixel's noise level, you can't recover that with software, but you might have been able to detect the star had all the light collected in one pixel only. You can make statistical arguments about the likelihood of there having been a star there, but you can't refocus.
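A back-of-the-envelope version of that argument, with made-up numbers: spreading a fixed amount of starlight over many pixels whose noise adds in quadrature cuts the detection SNR by the square root of the pixel count.

# Illustrative only: same total signal, different spreading.
total_signal = 50.0    # electrons collected from the star
read_noise = 10.0      # noise electrons per pixel

for n_pixels in (1, 25):
    snr = total_signal / (read_noise * n_pixels ** 0.5)   # noise adds in quadrature
    print(f"light spread over {n_pixels} pixel(s): SNR of about {snr:.1f}")
# Concentrated in one pixel the star is detectable (SNR 5); smeared over 25
# pixels it sinks to SNR 1, and no amount of software refocusing brings it back.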
20 posted on 11/17/2005 12:45:31 PM PST by coloradan (Hence, etc.)

