https://engineering.princeton.edu/news/2021/11/29/researcher...
The diagram shows two labelled parts that I didn't understand:
a) Glass plate with bandpass filter
b) Near-infrared contact image sensor
The legend doesn't say, but I suspect the diagrams show a distance sensor, not a camera. So I assume the infrared lasers have been omitted from the diagram. Also, I'd quite like to know a bit more about the optical bandpass filter. I suppose any "transparent" material is effectively a bandpass filter; this one presumably passes near-infrared, so is it like the dark-red plastic filter on a TV remote?
Why's the sensor called a "contact" image sensor?
Contact image sensors are image sensors designed to be slapped right up against something. They're used in scanners and surface inspection sensors. No clue how this relates to meta-lenses.
I suspect it's just a bad diagram. Their barrel design is impossible to manufacture.
Source: Optical engineer.
https://www.thorlabs.com/newgrouppage9.cfm?objectgroup_id=10...
Yup, since the optic is planar it can integrate with backend coating processes
> Contact image sensors ... No clue how this relates to meta-lenses.
I'm also not sure how "contact" got into the copy
> I suspect it's just a bad diagram. Their barrel design is impossible to manufacture.
Yea, the barrels end up looking like more traditional barrels
Source: Metalenz CTO
> The process starts by illuminating a scene with a monochromatic light source—a laser.
What that means is that this only works with monochromatic light: the focal distance is different for each wavelength, which is why it's useful for laser range finding. You would need to illuminate with several different lasers to get a "full color" image, and likely have multiple lenses... although some are tunable. So it's not gonna work for standard glasses, current VR, or anything else like that.
Yes. Diffractive optics (which includes meta-lenses) have significant wavelength dependence. Visible light spans 400-700nm, wavelengths that differ by nearly a factor of two. This means blue light will focus almost twice as far away as red light does.
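To make the factor-of-two claim concrete: an ideal diffractive lens has a focal length that scales inversely with wavelength, f(λ) = f₀·λ₀/λ. A quick back-of-envelope sketch (the 10 mm focal length and 550 nm design wavelength are made-up numbers, not from the article):

```python
# Ideal diffractive lens: focal length scales as f(lam) = f0 * lam0 / lam.
# Hypothetical design point: f0 = 10 mm at lam0 = 550 nm (green).
f0_mm, lam0_nm = 10.0, 550.0

def diffractive_focal_mm(lam_nm: float) -> float:
    """Focal length (mm) of an ideal diffractive lens at wavelength lam_nm."""
    return f0_mm * lam0_nm / lam_nm

f_blue = diffractive_focal_mm(400.0)  # 13.75 mm
f_red = diffractive_focal_mm(700.0)   # ~7.86 mm
print(f_blue / f_red)                 # 1.75: blue focuses ~1.75x farther than red
```

The ratio is just λ_red/λ_blue = 700/400 = 1.75, regardless of the design focal length, which is why broadband imaging through a bare diffractive lens smears badly.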
The neat bit is that this is actually the reverse of how refractive optics behave, which means you can use the two together and cancel out a significant portion of chromatic aberration. If manufacturing can be scaled up (and ideally applied to curved surfaces), they could improve performance and reduce the complexity and weight of VR/AR optics.
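The cancellation trick can be sketched with the standard thin-lens achromat condition: split the total power φ between two elements in contact so that φ₁/V₁ + φ₂/V₂ = 0, where V is the Abbe number. A diffractive surface has an effective Abbe number of about -3.45 (negative, the "reverse" dispersion mentioned above); the glass value below is typical crown glass, not anything specific to this article:

```python
# Hybrid refractive/diffractive achromat (thin elements in contact):
# solve  phi1 + phi2 = phi_total  and  phi1/V1 + phi2/V2 = 0.
V_refractive = 64.2    # Abbe number of a typical crown glass (N-BK7)
V_diffractive = -3.45  # effective Abbe number of any diffractive surface

def achromat_split(phi_total: float, v1: float, v2: float) -> tuple[float, float]:
    """Powers (phi1, phi2) summing to phi_total with zero primary chromatic aberration."""
    phi1 = phi_total * v1 / (v1 - v2)
    phi2 = -phi_total * v2 / (v1 - v2)
    return phi1, phi2

phi_r, phi_d = achromat_split(1.0, V_refractive, V_diffractive)
print(phi_r, phi_d)  # ~0.949 refractive + ~0.051 diffractive, both positive
```

Because the diffractive Abbe number is negative, both elements end up with positive power; in an all-glass achromat the flint element would instead have to be negative, which costs thickness and weight.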
Spoiler alert: they can't replace your smartphone's camera lenses yet, but they can be used for the IR distance sensors on drones and, soon, polarization sensors that will be able to tell materials apart and even detect cracks in concrete.
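For the polarization-sensing part: a polarization camera typically records intensity behind polarizers at 0°, 45°, 90°, and 135°, from which the linear Stokes parameters (and the degree/angle of linear polarization used to distinguish materials) follow directly. A minimal sketch of the standard math; the example intensities are made up, not sensor data:

```python
import math

def linear_stokes(i0: float, i45: float, i90: float, i135: float):
    """Linear Stokes parameters from intensities behind 0/45/90/135-degree polarizers."""
    s0 = 0.5 * (i0 + i45 + i90 + i135)   # total intensity
    s1 = i0 - i90                        # horizontal vs. vertical preference
    s2 = i45 - i135                      # +45 vs. -45 preference
    dolp = math.hypot(s1, s2) / s0       # degree of linear polarization, 0..1
    aop = 0.5 * math.atan2(s2, s1)       # angle of polarization (radians)
    return s0, s1, s2, dolp, aop

# Fully horizontally polarized unit-intensity light (Malus's law intensities):
s0, s1, s2, dolp, aop = linear_stokes(1.0, 0.5, 0.0, 0.5)
print(dolp, aop)  # 1.0 and 0.0: fully polarized along the horizontal
```

Shiny dielectric surfaces and stress features change DoLP/AoP even when plain intensity looks uniform, which is roughly why polarization imaging can flag cracks that an ordinary camera misses.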
Video of Gavin Smith explaining polarization with a neat mechanical/visual demonstration: https://youtu.be/9SAzxlF57mc?feature=shared&t=128
Not sure if he ever made the camera.