We are living in a time when mirrorless cameras have become a reality, and now scientists are working towards creating cameras without lenses.
That’s right! Researchers at the Massachusetts Institute of Technology (MIT) have been experimenting with a technology known as compressed sensing. For those of you who are not aware, compressed sensing is a technique by which scientists can extract large amounts of information from a single signal. This technique has already been used by Rice University scientists to develop a camera that can capture a 2D image using only a single light sensor, instead of the millions of sensor pixels found in current consumer cameras. Utilising this technology in cameras could pave the way for a massive breakthrough that will revolutionise cameras as we know them.
Researchers at the institute recently revealed that they have managed to develop a working single-pixel camera that is 50 times more efficient than previously developed single-pixel cameras. It requires only a dozen exposures to generate an image, instead of the thousands required by earlier single-pixel cameras.
The technology prior to the MIT study wasn’t very efficient because thousands of exposures were required to create just one image. Now, MIT researchers have developed a new technique that makes compressed sensing 50 times more efficient.
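To make the single-pixel idea concrete, here is a minimal, hypothetical sketch (illustrative only, not the MIT team's actual method, and the mask sizes and counts are invented for the demo): each "exposure" records the total light reaching one sensor through a random mask, and a sparse scene is then recovered from far fewer exposures than pixels, here via orthogonal matching pursuit.

```python
import numpy as np

rng = np.random.default_rng(0)
n_pixels, n_exposures, sparsity = 64, 32, 3

# A sparse "scene": mostly dark, with a few bright pixels.
x = np.zeros(n_pixels)
bright = rng.choice(n_pixels, size=sparsity, replace=False)
x[bright] = rng.uniform(1.0, 2.0, size=sparsity)

# Each exposure: the single sensor sums the scene through one random +/-1 mask.
masks = rng.choice([-1.0, 1.0], size=(n_exposures, n_pixels)) / np.sqrt(n_exposures)
y = masks @ x  # one scalar reading per exposure -- far fewer than n_pixels

def omp(A, y, max_iter):
    """Orthogonal matching pursuit: greedily add the mask column most
    correlated with the residual, then re-fit by least squares."""
    residual, support, coef = y.copy(), [], np.array([])
    for _ in range(max_iter):
        if np.linalg.norm(residual) < 1e-10:  # readings fully explained
            break
        support.append(int(np.argmax(np.abs(A.T @ residual))))
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x_hat = np.zeros(A.shape[1])
    x_hat[support] = coef
    return x_hat

x_hat = omp(masks, y, max_iter=12)
print("reconstruction error:", np.linalg.norm(x_hat - x))
```

The key point is that 32 scalar readings suffice to recover a 64-pixel scene here, because the scene is sparse; that trade-off between exposures and scene complexity is what the MIT work makes dramatically more efficient.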
Scientists say that while their research isn’t going to yield a camera without lenses any time soon, it does pave the way for research in this direction, and at a far greater pace.
Compressed sensing, which is effectively computational imaging, questions the need for a lens in a camera at all. In current cameras, the lens maps points in space to pixels on a sensor array, with everything precisely structured and engineered. With compressed sensing, that’s not the case.
Scientists question whether the sensor has to be a structured array at all, and how many pixels it really needs. Is a single pixel sufficient? These questions essentially break down the fundamental idea of what a camera is, say the scientists at MIT.
The future of lens-equipped cameras as we know them today isn’t too bright, as the latest study indicates that cameras with compressed sensing wouldn’t require lenses, and could therefore be useful in harsh environments, or even for viewing wavelengths outside the visible spectrum. That means your smartphone camera could one day easily have built-in infrared capability, unlike the bulky Cat S60.
In addition, the technology has massive implications for other types of cameras. It relies on something called “time-of-flight imaging”, where the camera’s ultra-fast sensor measures how quickly a short burst of light is reflected back. Think of it like the echo navigation in sonar, except with light. This research could then tie into building efficient cameras that can see around corners, or even improving the navigational capability of self-driving cars.
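The sonar analogy boils down to simple arithmetic: light travels out to the object and back, so the distance is the speed of light times the round-trip time, halved. A quick back-of-the-envelope sketch (the function name is made up for illustration, not from the study):

```python
# Time-of-flight ranging: a pulse goes out and comes back, so the one-way
# distance is c * t / 2, exactly like sonar but with light instead of sound.

C = 299_792_458.0  # speed of light in m/s

def distance_from_round_trip(t_seconds: float) -> float:
    """One-way distance from a round-trip pulse time."""
    return C * t_seconds / 2.0

# A pulse returning after 10 nanoseconds bounced off something ~1.5 m away.
print(distance_from_round_trip(10e-9))  # ≈ 1.4990 m
```

The nanosecond scale of these timings is why time-of-flight imaging demands such an ultra-fast sensor.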
Not to mention being able to set your smartphone down on a flat surface without worrying about scratching the lens.