"Time-Folded Optics" Trades Distance for Time in Ultrafast Photography

Ultrafast photography uses ultrafast camera sensors to capture ultrashort pulses of light. This has enabled applications such as imaging at billions to trillions of frames per second, scanning through closed books, and generating depth maps of 3D scenes.

The difficulty with such systems is that, in order to function properly, they are subject to significant design constraints. One of these is that the lens must sit at a distance from the imaging sensor equal to or greater than its focal length to capture images, meaning such systems require very long lens assemblies.

Now, however, MIT Media Lab researchers appear to have found a way around this by swapping out distance for time. They modified a type of ultrafast sensor called a streak camera so that light entering the lens system bounces off a series of small mirrors. An image is captured at each bounce, with each image corresponding to a particular length of time and, by correlation, a particular distance traveled from the camera.
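The correspondence between time and distance follows from the constant speed of light: a photon's arrival time tells you how far it has traveled, including extra round trips through the folded cavity. The sketch below illustrates that mapping; the cavity length and timing values are illustrative assumptions, not figures from the MIT work.

```python
# Hedged sketch: mapping photon arrival time to optical path length in a
# folded lens system. Cavity length is an assumed, illustrative value.

C = 299_792_458.0  # speed of light in vacuum, m/s


def optical_path_length(arrival_time_s: float) -> float:
    """Total distance light has traveled when it arrives at time t."""
    return C * arrival_time_s


def round_trips(arrival_time_s: float, cavity_length_m: float) -> int:
    """Number of complete round trips through a cavity of the given length."""
    round_trip_time = 2 * cavity_length_m / C  # out and back
    return int(arrival_time_s // round_trip_time)


# Example: for a hypothetical 5 cm cavity, a photon arriving 1 ns after
# the pulse entered has completed two full round trips.
print(round_trips(1e-9, 0.05))  # → 2
```

Time-gating the sensor at multiples of the round-trip time is what lets each captured frame isolate light that has made a specific number of passes.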

Each round trip the light pulse makes through the lens system moves the image focal point closer to the lens, allowing for a much more compact lens system than would be possible with conventional optics. This is especially useful for ultrafast photography over great distances, such as imaging space, or imaging the ground from space.
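To see why folding buys compactness, note that a conventional setup needs roughly its focal length of physical depth behind the lens, whereas folding the path into k passes through a mirror cavity shrinks that depth to about f / k. The sketch below is a simplified back-of-the-envelope model, not the researchers' actual design; the numbers are illustrative.

```python
# Hedged sketch of the space saving from folding the optical path.
# Assumption: a path folded into `passes` segments needs roughly
# 1/passes of the unfolded physical depth.


def physical_depth(focal_length_m: float, passes: int) -> float:
    """Approximate physical depth of a lens system whose optical path
    is folded into the given number of passes (1 = conventional)."""
    if passes < 1:
        raise ValueError("passes must be at least 1")
    return focal_length_m / passes


conventional = physical_depth(0.5, 1)   # 0.5 m of depth for a 500 mm lens
folded = physical_depth(0.5, 10)        # ~0.05 m after 10 folded passes
print(conventional, folded)
```

Under this rough model, a long-focal-length system for imaging at great distances could be folded into a fraction of its conventional depth, which is what makes the technique attractive for space-based imaging.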

#Optics #Photography #StreakCamera