Optical blur, or the point spread function (PSF), is an umbrella term for a laundry list of image-degrading effects such as defocus, diffraction, and aberrations. It is hard to calibrate because it varies with focus setting, distance to the target, and where you look on the image plane. We introduce Lens Blur Fields: tiny multilayer perceptrons (MLPs) that can model this high-dimensional PSF.
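To make the idea concrete, here is a minimal sketch of such a blur field: a tiny numpy MLP that maps a 5-D query (image-plane position x, y; PSF-kernel offset u, v; focus setting) to a scalar PSF intensity. The layer sizes, positional encoding, and the particular 5-D parameterization are illustrative assumptions for this sketch, not the paper's exact architecture, and the network below is randomly initialized rather than trained.

```python
import numpy as np

def positional_encoding(x, num_freqs=4):
    """Encode each input coordinate with sin/cos at octave frequencies."""
    freqs = 2.0 ** np.arange(num_freqs) * np.pi
    scaled = x[..., None] * freqs                  # (..., dim, num_freqs)
    enc = np.concatenate([np.sin(scaled), np.cos(scaled)], axis=-1)
    return enc.reshape(*x.shape[:-1], -1)          # (..., dim * 2 * num_freqs)

class BlurFieldMLP:
    """Illustrative tiny MLP: 5-D coordinate -> PSF intensity in (0, 1)."""

    def __init__(self, in_dim=5, hidden=64, num_freqs=4, seed=0):
        rng = np.random.default_rng(seed)
        d = in_dim * 2 * num_freqs
        self.W1 = rng.normal(0, np.sqrt(2 / d), (d, hidden))
        self.b1 = np.zeros(hidden)
        self.W2 = rng.normal(0, np.sqrt(2 / hidden), (hidden, hidden))
        self.b2 = np.zeros(hidden)
        self.W3 = rng.normal(0, np.sqrt(2 / hidden), (hidden, 1))
        self.b3 = np.zeros(1)

    def __call__(self, coords):
        """coords: (..., 5) array of (x, y, u, v, focus), each in [0, 1]."""
        h = positional_encoding(coords)
        h = np.maximum(h @ self.W1 + self.b1, 0.0)   # ReLU
        h = np.maximum(h @ self.W2 + self.b2, 0.0)
        out = h @ self.W3 + self.b3
        return 1.0 / (1.0 + np.exp(-out[..., 0]))    # sigmoid -> intensity

def render_psf(model, x, y, focus, kernel=9):
    """Query the field on a kernel x kernel grid of (u, v) offsets at one
    image-plane location, then normalize the kernel to sum to 1."""
    u = np.linspace(0.0, 1.0, kernel)
    uu, vv = np.meshgrid(u, u, indexing="ij")
    coords = np.stack([np.full_like(uu, x), np.full_like(uu, y),
                       uu, vv, np.full_like(uu, focus)], axis=-1)
    psf = model(coords)
    return psf / psf.sum()

model = BlurFieldMLP()
psf = render_psf(model, x=0.25, y=0.75, focus=0.5)
print(psf.shape)  # (9, 9)
```

Because the PSF is queried pointwise, the same small network covers the whole 5-D domain; changing `x`, `y`, or `focus` yields a different kernel without storing a dense table of measured PSFs.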
Two smartphones of the same make and model can have subtly different PSFs. We show this with the lens blur fields of two iPhone 12 Pros:
We provide a first-of-its-kind dataset of 5-D blur fields for smartphone cameras, camera bodies equipped with a variety of lenses, and more. Finally, we show that the acquired 5-D blur fields are expressive and accurate enough to reveal, for the first time, differences in the optical behavior of smartphone devices of the same make and model.
It seems cameras have fingerprint-like optical characteristics: in principle, a device's blur field could help identify which camera took a given photo, acting like a permanent watermark.
